
Teslas with Autopilot under NHTSA Investigation, Recall Possible

Michael Simari | Car and Driver

  • NHTSA opened a probe into Tesla’s Autopilot software last August, then asked for more information, and is now expanding its investigation to an engineering analysis, which could lead to a recall.
  • At issue is how Tesla’s driver-assistance software identifies stopped first-responder vehicles at crash scenes and how the cars alert drivers to those hazards.
  • More than 800,000 vehicles are potentially affected: Model S vehicles built between 2014 and 2021, Model X (2015–2021), Model 3 (2018–2021), and Model Y (2020–2021).

    The National Highway Traffic Safety Administration (NHTSA) will take a deeper look at how Tesla vehicles equipped with the company’s Autopilot driver-assistance software behave when encountering first-responder vehicles at the scene of a collision. NHTSA said this week that it is upgrading the Preliminary Evaluation it opened last August to an Engineering Analysis, the next step toward a possible recall of hundreds of thousands of Tesla vehicles.

    NHTSA said in its notice that it was motivated to upgrade the status of the investigation because of “an accumulation of crashes in which Tesla vehicles, operating with Autopilot engaged, struck stationary in-road or roadside first responder vehicles tending to pre-existing collision scenes.”

    What Level 2 Means

    NHTSA said that Tesla itself characterizes Autopilot as “an SAE Level 2 driving automation system designed to support and assist the driver,” and many automakers offer some sort of Level 2 system in their new vehicles. In fact, as part of its probe last fall, NHTSA asked Tesla and a dozen other automakers for information on how their Level 2 systems operate.

    Based on public information so far, NHTSA now appears interested only in understanding Tesla Autopilot performance. NHTSA followed up its August information request with a second request last October, asking specifically how Tesla changes Autopilot through over-the-air updates and why it requires non-disclosure agreements from owners whose vehicles are part of Tesla’s so-called Full Self-Driving (FSD) “beta” release program. Despite the name, FSD is not capable of driving the car on its own.

    In a public update on its probe, NHTSA laid out its case for why Autopilot needs to be investigated. NHTSA said it has so far investigated 16 crashes and found that Autopilot aborted its own vehicle control, on average, “less than one second prior to the first impact,” even though video of these events showed that the driver should have been aware of a potential incident an average of eight seconds before impact. NHTSA found that most of the drivers had their hands on the wheel (as Autopilot requires) but that the vehicles did not alert them in time to take evasive action.

    100 Other Crashes to Get a Second Look

    NHTSA is also reviewing more than 100 other crashes that happened with Teslas using Autopilot but that did not involve first-responder vehicles. Its preliminary review of these incidents shows that in many cases the driver was “insufficiently responsive to the needs of the dynamic driving task.” This is why NHTSA will use its investigation to assess “the technologies and methods [Tesla uses] to monitor, assist, and enforce the driver’s engagement with the dynamic driving task during Autopilot operation.”

    A total of 830,000 Tesla vehicles are part of the upgraded investigation, covering all of Tesla’s current models: Model S vehicles built between 2014 and 2021, Model X (2015–2021), Model 3 (2018–2021), and Model Y (2020–2021). NHTSA’s documents say the agency is aware of 15 injuries and one fatality related to the Autopilot first-responder problem.

    Sen. Ed Markey of Massachusetts tweeted that he’s glad NHTSA is escalating its probe, because “every day that Tesla disregards safety rules and misleads the public about its ‘Autopilot’ system, our roads become more dangerous.”

    Tesla CEO Elon Musk is still touting the benefits of Full Self-Driving (FSD) and announced the expansion of the latest beta software to 100,000 cars earlier this month on Twitter. He claimed that the new update will be able to “handle roads with no map data at all” and that “within a few months, FSD should be able to drive to a GPS point with zero map data.”


    The Autopilot investigation is separate from another recent move by NHTSA to request more information from Tesla about “phantom braking” caused by the company’s automated emergency braking (AEB) systems. The company has until June 20 to submit documents about hundreds of reported AEB problems to the government.

