A Consumer Reports test driver has demonstrated in a new video that, yes, a Tesla vehicle with the Autopilot feature will drive with no one in the driver’s seat.
The question came up after a fatal accident in Texas last weekend, said Jake Fisher, head of auto testing at Consumer Reports. In that crash, according to police, no one was in the driver’s seat of a Tesla Model S when it crashed at high speed, killing two passengers.
On Monday, Tesla CEO Elon Musk tweeted about the accident, saying “Data logs recovered so far show Autopilot was not enabled,” and that “Moreover, standard Autopilot would require lane lines to turn on, which this street did not have.” Musk also said that the car was not equipped with the more advanced “Full Self-Driving” feature.
It was not clear from Musk’s tweet whether he meant the Autopilot system was not enabled at the moment of impact or that it had not been used at all in that fatal drive.
Tesla has repeatedly faced allegations that the name of its “Autopilot” driver assistance system misleads some drivers into believing the car can safely drive itself.
But Consumer Reports wasn’t necessarily trying to recreate the Texas crash, Fisher said. That incident is still being investigated, so it’s not known exactly what happened. Instead, Fisher said, Consumer Reports investigators were trying to show how driver assistance systems like this one – including those from companies other than Tesla – are open to abuse without more effective driver monitoring.
“If a system can’t even tell if a driver is in the seat, it’s clearly insufficient,” he said.
The researchers at Consumer Reports said they found it easy to trick Tesla’s Autopilot feature into driving without anyone in the driver’s seat.
“In our evaluation, the system not only failed to make sure the driver was paying attention, but it also couldn’t tell if there was a driver there at all,” said Fisher, who conducted the experiment.
The demonstration was done at the consumer advocacy group’s Connecticut test track where there were no other cars, no pedestrians and no trees or other obstacles the vehicle could hit if things went badly wrong, the report said. That part of Consumer Reports’ track is specifically designed to test driver assistance systems, such as Autopilot.
Fisher reported that he began the trial in the driver’s seat of a Tesla Model Y. After turning on the car’s Autopilot system while the car was moving, he reduced the car’s speed to zero, then moved over to the passenger seat. From there, he reached over to increase the car’s speed using a dial on the steering wheel. With Autopilot set to 30 miles per hour, the vehicle stayed on course and continued to drive without anyone in the driver’s seat. Fisher said the system did not issue a warning or indicate in any way that the driver’s seat was empty.
Tesla vehicles do have a weight sensor in the driver’s seat, Fisher said, which the vehicle uses to recognize when the seat is occupied so it can turn on, since it does not have a traditional key or start button. But the seat sensor evidently is not tied into the Autopilot system to prevent it from being used with no driver, he said.
“I was actually a bit shocked at how easy it was to do what I did,” Fisher said.
Instead, the system relies on occasionally sensing a driver’s hand on the steering wheel to confirm that a driver is present and engaged in driving. To get around this, Fisher said he hung a weighted rope on one side of the steering wheel to apply a gentle, constant pull on it.
As Fisher’s rope trick shows, it’s easy to mimic a continuous tug on the steering wheel. In 2018, the National Highway Traffic Safety Administration ordered the maker of a device designed specifically to trick such a system – a simple weight that hangs from the steering wheel so the car thinks someone is holding it – to stop selling it.
Other cars with advanced driver assistance systems also rely on steering wheel input to assess a driver’s involvement. Similar systems on other cars that lack sophisticated driver monitoring could likewise be abused, Kelly Funkhouser, Consumer Reports’ program manager for vehicle interface testing, said in the report.
Tesla, which has generally not responded to press inquiries for over a year, did not reply to a request for comment on Consumer Reports’ findings or on the meaning of Musk’s tweet.
The NHTSA and the National Transportation Safety Board are investigating last weekend’s crash.
Investigators have not said whether Tesla’s Autopilot or the more advanced “Full Self-Driving” feature was involved in the crash.
According to Tesla’s website and public statements, these systems are not intended to actually drive a car without human involvement.
Consumer Reports’ testing team recommended that Tesla use the front seat’s weight sensor to ensure that a driver is in the seat when Autopilot is used. William Wallace, manager of safety policy at Consumer Reports, also suggested that driver monitoring systems, which use cameras to ensure drivers are paying attention to the road, should be required safety equipment. Such systems are used today in combination with driver assistance systems in a number of cars. Auto safety regulators in Europe plan to start requiring them in 2023.