Tesla’s Autopilot System Is ‘Easily’ Tricked Into Operating Without Anyone at the Wheel
Consumer Reports said Thursday its researchers “easily” tricked a Tesla’s Autopilot system into driving without anyone at the wheel. The publication’s worrying demonstration comes as officials continue to investigate a fatal crash in Texas involving a Tesla that authorities believe had no one in the driver’s seat at the time.
Using a Tesla Model Y, Consumer Reports engineers were able to “drive” around a closed test track while seated in the front passenger seat and back seat. To fool the car’s driver-assistance system, they attached a weighted chain to the steering wheel to simulate the pressure of a driver’s hands and used the speed dial on the steering wheel to accelerate from a full stop. As long as they kept the driver’s side door closed and the driver’s seat belt buckled (so that the system didn’t disengage automatically), the vehicle continued to drive up and down the half-mile track, following the painted lane lines, apparently none the wiser.
“It was a bit frightening when we realized how easy it was to defeat the safeguards, which we proved were clearly insufficient,” Jake Fisher, the publication’s senior director of auto testing, who conducted the experiment, said in a statement.
“In our evaluation, the system not only failed to make sure the driver was paying attention, but it also couldn’t tell if there was a driver there at all,” he continued. “Tesla is falling behind other automakers like GM and Ford that, on models with advanced driver assist systems, use technology to make sure the driver is looking at the road.”
Don’t try any of this at home, though, Fisher warned, emphasizing that the Consumer Reports team conducted its tests on a closed course at relatively low speeds, with safety crews on standby.
“Let me be clear: Anyone who uses Autopilot on the road without someone in the driver’s seat is putting themselves and others in imminent danger,” he said.
The vehicle involved in Saturday’s fatal crash was reportedly a Tesla Model S, a different model from the one Consumer Reports used in its experiment. However, both models use the same Autopilot system, the publication notes.
On Tesla’s support page for the system, the company discloses that its cars are not fully autonomous. It also warns that, despite their names, the Autopilot and Full Self-Driving features require “active driver supervision.”
But those warnings haven’t stopped Tesla drivers from handing control over to their car’s Autopilot system while they sleep, change seats, or otherwise take their eyes off the road. In 2018, California police pulled over a driver in a Tesla Model S who was drunk and asleep at the wheel while his car sped along by itself at 70 miles per hour (112 kilometers per hour). A similar incident happened in Canada last September, when a Tesla Model S owner was charged with dangerous driving after he was found asleep at the wheel while traveling down the highway at 93 miles per hour (150 kilometers per hour).
And Saturday’s crash isn’t an isolated incident. The National Highway Traffic Safety Administration, America’s car safety regulator, has reportedly opened at least 14 investigations into Tesla crashes in which Autopilot is suspected to have been in use. This week, the NHTSA announced it was also sending a team to investigate the crash in Texas.