Tesla Driver Was Playing Game Before Deadly Crash. But Tesla Software Failed Too

The front of a Tesla Model X in Los Angeles in 2017. (Justin Sullivan/Getty Images)

Irresponsibility — by carmaker Tesla and by a Tesla driver — contributed to a deadly crash in California in 2018, federal investigators say.

The driver appears to have been playing a game on a smartphone immediately before his semi-autonomous 2017 Model X accelerated into a concrete barrier. Distracted by his phone, he did not intervene to steer his car back toward safety and was killed in the fiery wreck.

But Tesla should have anticipated that drivers would misuse its Autopilot feature in this way and should build in more safeguards to prevent deadly crashes.

That’s according to the National Transportation Safety Board, which spent nearly two years investigating the crash.

Tesla’s advanced driver assistance software is called “Autopilot.” The name suggests the car can steer itself, but the system’s capabilities are limited, and drivers are supposed to stay attentive so they can take over if necessary.

“When driving in the supposed self-driving mode, you can’t sleep. You can’t read a book. You can’t watch a movie or TV show. You can’t text. And you can’t play video games,” Robert L. Sumwalt, chairman of the NTSB, said Tuesday.


But the NTSB did not solely blame the driver, Apple engineer Walter Huang, for the crash. It was also highly critical of Tesla for failing to anticipate and prevent this misuse of technology.

After all, there’s video evidence that Tesla drivers using Autopilot do sleep, text, and, like Huang, play video games. Owners swap tips on forums about how to trick the software into thinking they’re holding the steering wheel.

In Huang’s case, the vehicle’s software had detected that his hands were not on the wheel. Still, the SUV merely warned him to pay attention rather than disabling the semi-autonomous steering.

Tesla also allows its Autopilot system to be used on roadways that the software is not designed to handle, creating safety risks, the NTSB says.

Other carmakers have similar issues with their advanced driver assistance features, the NTSB found. But only Tesla has failed to respond to the board’s new recommendations.


The board also criticized Apple, Huang’s employer, for not prohibiting employees from using their devices while driving. Huang was a game developer and was using his company-issued work phone at the time of the crash.

Highway maintenance also played a role in the severity of the crash, the NTSB has previously said. A metal “crash cushion” should have softened the blow of the collision, but it had been damaged in a previous crash and was no longer effective.

The same barrier had been hit repeatedly over several years, including another crash that caused a fatality, and often went unrepaired for long stretches of time, according to NTSB documents.
