Two Are Dead After a Tesla Believed to Be Driverless Crashes
Two men died after the Tesla they were riding in, which local authorities believe had no one in the driver’s seat, crashed into a tree and burst into flames north of Houston on Saturday.
Authorities said that the Tesla, which was identified only as a 2019 model, was traveling at high speed when it failed to negotiate a turn at a cul-de-sac, ran off the road, and hit a tree, local station KPRC reported. One of the men was found in the car’s front passenger seat and the other in its rear passenger seat.
Mark Herman, the Precinct 4 constable for Harris County, told the Wall Street Journal that authorities were still investigating whether the front passenger seat’s airbag deployed. They are also working to determine whether the car’s driver-assistance system, Autopilot, was engaged at the time of the crash.
Herman said the investigation so far indicates that no one was driving the Tesla.
“Our preliminary investigation is determining—but it’s not complete yet—that there was no one at the wheel of that vehicle,” Herman said. “We’re almost 99.9% sure.”
The Tesla burned for hours, according to KPRC, and authorities used 32,000 gallons of water to extinguish the blaze because the car’s batteries kept reigniting. Law enforcement even resorted to calling Tesla to ask company officials how to put out the battery fire.
Gizmodo reached out to Tesla for comment on the incident on Sunday but did not receive a response by the time of publication. We’ll update this post if we hear back, though it should be noted that Tesla dissolved its press team a few months ago.
The incident underscores the current limitations of Tesla’s Autopilot system and highlights the persistent confusion over what the system can and cannot do.
On a support page dedicated to the system, which comes in two packages, Autopilot and Full Self-Driving, Tesla states that its cars are not fully autonomous. The company goes on to say that Autopilot and Full Self-Driving features require “active driver supervision.”
“Autopilot and Full Self-Driving Capability are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment,” Tesla says on its support page. “While these features are designed to become more capable over time, the currently enabled features do not make the vehicle autonomous.”
Nonetheless, as Jalopnik has pointed out, names like “Full Self-Driving” have fed a misconception that Tesla’s cars are fully autonomous and that the driver is free to sleep, change seats, or take their hands off the wheel for long stretches. In 2018, police pulled over the driver of a Tesla Model S who was drunk and asleep at the wheel with Autopilot enabled; the car was driving itself at 70 miles per hour.
That same year, the driver of a Tesla Model S who had Autopilot activated crashed into an empty Ford Fiesta. The driver, who said he was looking at his phone at the time of the crash, sued the company for misleading him into thinking the car could operate with “minimal input and oversight.”
These cases are not isolated. As of early 2020, the National Highway Traffic Safety Administration, the country’s car safety regulator, had opened 14 investigations into Tesla crashes involving the Autopilot system.
The Journal notes that critics say Tesla doesn’t do enough to keep drivers from depending too heavily on Autopilot or from using it inappropriately. The NHTSA has no rules dictating how manufacturers must monitor driver engagement.