Fatal crash renews concerns over Tesla’s ‘Autopilot’ claims


The National Highway Traffic Safety Administration – the government agency that oversees auto safety – and the National Transportation Safety Board – an independent agency that investigates notable incidents – sent teams to Texas to investigate the crash. “We are actively engaged with local law enforcement and Tesla to learn more about the details of the crash and will take appropriate action when we have more information,” NHTSA said in a statement. It will likely take weeks, if not months, before the results of an investigation are released.

Yet the incident once again highlights the yawning gap between how Tesla markets its technology and its true capabilities, as spelled out in the cars’ on-screen warnings and owner’s manuals.

A small cottage industry of videos has sprung up on platforms like YouTube and TikTok in which people try to “trick” Autopilot into driving without an attentive driver in the front seat; some show people “sleeping” in the back of a moving car. Tesla owners have even demonstrated that once the driver’s seat belt is buckled, a car on Autopilot can be made to drive for a few seconds with no one behind the wheel.

Tesla – and Musk in particular – have a mixed history of public statements about autonomous driving and Autopilot. A Tesla on Autopilot gives visual and audible warnings if its sensors don’t detect pressure from the driver’s hands on the steering wheel every 30 seconds or so, and it will come to a stop if it detects no hands for a minute. But during a 60 Minutes appearance in 2018, Musk sat behind the wheel of a moving Model 3, leaned back, and rested his hands on his knees. “Now you’re not driving at all,” the anchor said in surprise.

This month, Musk told podcaster Joe Rogan, “I think Autopilot is getting good enough that you don’t need to drive most of the time unless you really want to.” The CEO has also repeatedly given optimistic assessments of his company’s progress toward autonomous driving. In 2019, he promised that Tesla would have 1 million robotaxis on the road by the end of 2020. But in the fall of 2020, company representatives wrote to the California Department of Motor Vehicles to assure the agency that Full Self-Driving “will remain largely unchanged in the future” and that FSD will remain an “advanced driver assistance feature” rather than a fully autonomous system.

So far, FSD has been released only to around 1,000 participants in the company’s beta-testing program. “Always be careful, but it’s getting mature,” Musk tweeted last month to FSD beta testers.

At least three people have died in crashes involving Autopilot. After an investigation into a fatal 2018 crash in Mountain View, Calif., the NTSB called on federal regulators and Tesla to ensure that drivers can use Tesla’s automated features only in the conditions they were designed for. It also recommended that Tesla install a more robust monitoring system to make sure drivers keep their attention on the road. General Motors, for example, allows its automated Super Cruise feature to operate only on pre-mapped roads, and a driver-facing camera detects whether the driver’s eyes are pointed toward the road.

A spokesperson for NHTSA said the agency has opened investigations into 28 crashes involving Tesla vehicles.

Data released by Tesla suggests its vehicles are safer than the average American car. On Saturday, hours before the fatal Texas crash, Musk tweeted that Teslas with Autopilot engaged are almost 10 times less likely to crash than the average vehicle, as measured by federal data. But experts point out that the comparison is not apt. Autopilot is meant to be used only on highways, while the federal data covers all kinds of driving conditions. And Teslas are heavy luxury cars, which tend to fare better in a crash.

