Tesla says it is 'morally obligated' to continue improving Autopilot, reiterates safety claims
Responding to a Washington Post investigation of serious crashes on roads where the feature could not reliably operate, the company says its data show the technology is saving lives and preventing injuries
US automaker Tesla Inc on Monday said it has a "moral obligation" to continue improving its Autopilot driver-assistance system and to make it available to more consumers, citing data that showed stronger safety metrics when the feature was engaged.
In response to a Washington Post investigation of serious crashes involving Autopilot on roads where the feature could not reliably operate, the company said its data showed it was saving lives and preventing injuries.
The Post report said the newspaper had identified at least eight crashes between 2016 and 2023 in which Autopilot could be activated in situations for which it was not designed, and said Tesla had taken few definitive steps to restrict the feature's use by geography despite having the technical capability to do so.
Autopilot is "intended for use on controlled-access highways" with "a center divider, clear lane markings, and no cross traffic," the Post said, adding Tesla's user manual advises drivers the technology can also falter on roads if there are hills or sharp curves.
The Post investigation "leverages instances of driver misuse of the Autopilot driver assist feature to suggest the system is the problem," Tesla said in a post on social media platform X, adding that Autopilot was about 10 times safer than the US average and five times safer than a Tesla without the technology enabled.
The company also reiterated that the driver remains responsible for control of the vehicle at all times and is notified of this responsibility.
The Post said regulatory bodies like the US National Highway Traffic Safety Administration (NHTSA) had not adopted rules to limit the technology to where it is meant to be used despite opening investigations into the software after identifying more than a dozen crashes in which Tesla vehicles hit stationary emergency vehicles.
NHTSA did not immediately respond to a request for comment from Reuters outside normal business hours. The agency told the Post that verifying systems like Autopilot were used only within the conditions for which they are designed would be too complex and resource-intensive, and might not fix the problem anyway.
Last month, a Florida judge found "reasonable evidence" that Tesla Chief Executive Elon Musk and other managers knew the automaker's vehicles had a defective Autopilot system but still allowed the cars to be driven unsafely.
The ruling came as a setback for Tesla after the company won two product liability trials in California this year over the Autopilot system.