Tesla Driver Was Playing Game Before Deadly Crash. But Tesla Software Failed, Too

The front of a Tesla Model X in Los Angeles in 2017. (Justin Sullivan / Getty Images)

Irresponsibility — by carmaker Tesla and by a Tesla driver — contributed to a deadly crash in California in 2018, federal investigators say.

The driver appears to have been playing a game on a smartphone immediately before his semi-autonomous 2017 Model X accelerated into a concrete barrier. Distracted by his phone, he did not intervene to steer his car back toward safety and was killed in the fiery wreck.

But Tesla should have anticipated that drivers would misuse its Autopilot feature this way, and it should build in more safeguards to prevent deadly crashes.

That's according to the National Transportation Safety Board, which spent nearly two years investigating the crash.

Tesla's advanced driver assistance software is called "Autopilot." That suggests the car can steer autonomously, but the system is limited and drivers are supposed to pay attention so they can take control from the car if necessary.

"When driving in the supposed self-driving mode, you can't sleep. You can't read a book. You can't watch a movie or TV show. You can't text. And you can't play video games," Robert L. Sumwalt, chairman of the NTSB, said Tuesday.

But the NTSB did not solely blame the driver, Apple engineer Walter Huang, for the crash. It was also highly critical of Tesla for failing to anticipate and prevent this misuse of technology.

After all, there's video evidence that Tesla drivers using Autopilot do sleep, text, and, like Huang, play video games. Owners swap tips on forums about how to trick the software into thinking they're holding the steering wheel.

In Huang's case, the vehicle's software had detected that his hands were not on the wheel at the time of the crash. Still, the SUV merely warned him to pay attention rather than disabling the semi-autonomous steering.

Tesla also allows its Autopilot system to be used on roadways that the software is not designed to handle, creating safety risks, the NTSB says.

Other carmakers have similar issues with their advanced driver assistance features, the NTSB found. But only Tesla has failed to respond to the board's new recommendations.

The board also criticized Apple, Huang's employer, for not prohibiting employees from using devices while driving. Huang was a game developer and was using his company-issued work phone at the time of the crash.

Highway maintenance also played a role in the severity of the crash, the NTSB has previously said. A metal "crash cushion" should have softened the blow of the collision, but it had been damaged in a previous crash and was no longer effective.

The same barrier had been hit repeatedly over several years, including another crash that caused a fatality, and often went unrepaired for long stretches of time, according to NTSB documents.

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Camila Flamiano Domonoske covers cars, energy and the future of mobility for NPR's Business Desk.