Tesla Sold ‘False Sense Of Security’ To Employee Killed In Suspected Full Self-Driving Crash

Tesla’s Full Self-Driving software can’t drive your car by itself. Despite its name, the advanced driver assist program requires drivers to keep their eyes on the road at all times and to be prepared to take the wheel at a moment’s notice. Now, the widow of a driver killed in a crash that involved FSD has accused the automaker of selling a “false sense of security” with the software.

In 2022, Tesla employee Hans von Ohain was driving his Model 3 electric vehicle alongside Erik Rossiter. The pair had been out playing golf one afternoon and had a few drinks before heading home. On the drive back, von Ohain reportedly let his Tesla take control of the drive by engaging its FSD software, the Washington Post reports.

However, the drive ended in disaster when the Model 3 careered off the road and burst into flames, killing von Ohain and injuring Rossiter. As the Post explains:

“The Tesla Model 3 barreled into a tree and exploded in flames, killing von Ohain, a Tesla employee and devoted fan of CEO Elon Musk. Rossiter, who survived the crash, told emergency responders that von Ohain was using an “auto-drive feature on the Tesla” that “just ran straight off the road,” according to a 911 dispatch recording obtained by The Washington Post. In a recent interview, Rossiter said he believes that von Ohain was using Full Self-Driving, which — if true — would make his death the first known fatality involving Tesla’s most advanced driver-assistance technology.”

While no crash has so far been definitively linked to the FSD program, the Post identified numerous collisions in which drivers claimed the software was engaged. These included a 2022 crash that caused a massive pileup in San Francisco and at least two serious crashes, among them the one that killed von Ohain.

Thousands of Tesla cars are now equipped with Full Self-Driving. Photo: Shen Chunchen/VCG (Getty Images)

Von Ohain’s crash is complicated, though, as a postmortem revealed that he was three times over the legal blood alcohol limit for driving. Still, police investigating the crash have sought to uncover the role FSD played in von Ohain’s death. The Post reports:

Von Ohain’s widow, Nora Bass, said she has been unable to find a lawyer willing to take his case to court because he was legally intoxicated. Nonetheless, she said, Tesla should take at least some responsibility for her husband’s death.

“Regardless of how drunk Hans was, Musk has claimed that this car can drive itself and is essentially better than a human,” Bass said. “We were sold a false sense of security.”

In the aftermath of the crash, investigators found that the Tesla continued to feed power to the wheels after impact. They also found no sign that either von Ohain or the car itself had applied the brakes to try to stop the Model 3 as it came off the road. This, Colorado State Patrol Sgt. Robert Madden told the Post, was a clear sign that “fits with the [driver-assistance] feature being engaged.”

Due to the intensity of the fire and the destruction it caused, Colorado investigators have been unable to access data from the car to determine whether FSD really was engaged. What’s more, Tesla said it “could not confirm that a driver-assistance system had been in use because it did not receive data over-the-air for this incident,” reports the Post.

Tesla did report the crash to the National Highway Traffic Safety Administration as part of its continued reporting of crashes involving its Autopilot and FSD systems. However, NHTSA could not confirm which program was involved in the crash.