Future Tech

Tesla to remote patch 2M vehicles after damning Autopilot safety probe

Tan KW
Publish date: Thu, 14 Dec 2023, 08:17 AM

The US National Highway Traffic Safety Administration's (NHTSA) investigation into safety risks associated with Tesla's Autopilot has concluded with a recall of more than two million vehicles, with the agency determining that Autopilot's safety controls are "insufficient to prevent misuse."

Some 2012-2023 Tesla Model S vehicles equipped with any version of Autosteer are included in the recall [PDF], along with all Model X, Model 3, and Model Y vehicles manufactured since 2016, 2017, and 2020, respectively. The NHTSA estimates that 100 percent of the recalled vehicles are affected, totaling 2,031,220.

"In certain circumstances when Autosteer is engaged, and the driver does not maintain responsibility for vehicle operation and is unprepared to intervene as necessary or fails to recognize when Autosteer is canceled or not engaged, there may be an increased risk of a crash," the NHTSA said [PDF]. 

The software recall brings an end to a two-year probe by the NHTSA that began in 2021 after a number of highly publicized accidents, including several in which Teslas with Autopilot activated hit emergency vehicles parked on the side of highways.

The NHTSA upgraded its investigation to an engineering analysis last year after finding reasons to believe "Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks by undermining the effectiveness of the driver's supervision," the agency said in 2022. 

After reviewing 956 crashes in which Autopilot was alleged to have been in use, the NHTSA singled out 322 for particular focus, and it appears to have concluded that its initial suspicions were correct.

"In certain circumstances when Autosteer is engaged, the prominence and scope of the feature's controls may not be sufficient to prevent driver misuse of the SAE Level 2 advanced driver-assistance feature," the NHTSA said in a letter to Tesla confirming receipt of its safety recall. The NHTSA didn't respond to questions for this story. 

SAE Level 2 systems require drivers to maintain attention. Tesla has been accused of misleading buyers about the capabilities of its Autopilot advanced driver assistance system (ADAS) through its name, and through the company's use of Full Self Driving (FSD) as the name for a more advanced variant.

Tesla said it didn't concur with the NHTSA's findings, but on December 5 decided to issue a voluntary recall "in the interest of resolving" the investigation.  

An update to Autopilot software will be delivered to affected vehicles "on or shortly after December 12," Tesla said in its recall notice to the NHTSA. The recall will "incorporate additional controls and alerts to those already existing on affected vehicles to further encourage the driver to adhere to their continuous driving responsibility," Tesla said.

Additional controls, which will vary based on available vehicle hardware, may include more prominent visual alerts, a simpler method of engaging and disengaging Autopilot, additional checks when using the feature "outside of controlled access highways" and suspension of Autopilot if the driver "repeatedly fails to demonstrate continuous and sustained driving responsibility." 

This is the second Autopilot recall Tesla has issued this year, with another OTA update issued in February after the NHTSA said Autopilot had a tendency to "act unsafe around intersections" and ignore changes in speed limits. 

The Register has asked Tesla for comment on this story, but hasn't heard back.

While the investigation has concluded, the NHTSA said it will leave the case open to "support an evaluation of the effectiveness" of the Autopilot update being deployed to address the issue. The NHTSA has asked Tesla to submit eight consecutive quarterly reports, followed by three annual ones, to demonstrate improved safety. ®


https://www.theregister.com//2023/12/13/tesla_recall_autopilot_safety/
