Teslas Are Braking for No Reason, But That's Not Autopilot's Only Problem

An aerial view of Teslas parked at the company’s Fremont, CA facility. Photo: Justin Sullivan (Getty Images)

The National Highway Traffic Safety Administration (NHTSA) has released a damning report on Tesla’s Level 2 driver-assist systems, called Autopilot and Full Self-Driving. Over 750 Tesla owners have reported that their vehicles braked suddenly on roadways for no clear reason. While that should be a concern for Tesla, it’s also far from the only safety problem the automaker’s semi-autonomous technology has faced.

Since introducing features with names like “Autopilot” and “Full Self-Driving,” Tesla has faced criticism for overstating the capabilities of what are merely driver-assist systems that still require constant vigilance from the person behind the wheel. Linguistic concerns are only part of the problem, though; the technology itself has been riddled with faults that are being discovered by ordinary consumers beta-testing the software on public roads.

In this most recent phantom braking issue, NHTSA has requested more information from Tesla about the 750 complaints. From the Associated Press:

In the letter, NHTSA asks for the initial speed when the cars began to brake, the final speed, and the average deceleration. It also asks if the automated systems detected a target obstacle, and whether Tesla has video of the braking incidents.

The agency is now seeking information on warranty claims for phantom braking, including the owners’ names and what repairs were made. It’s also seeking information on Tesla’s sensors, any testing or investigations into the braking problems, and whether any modifications were made.


The letter focuses on Tesla’s testing of the automated systems when it comes to detecting metal bridges, s-shaped curves, oncoming and cross traffic, and different sizes of vehicles including large trucks. The agency also wants information on how cameras deal with reflections, shadows, glare and blockage due to snow or heavy rain.

In 2017, Autopilot steered a man into a concrete barrier at 70 mph in a fatal crash; the driver, it was found, had been using his cell phone and may not have noticed that his Tesla had taken a sharp turn. The National Transportation Safety Board also found that Autopilot likely had not been programmed to recognize concrete barriers and therefore couldn’t be expected to stop for one.

The inability to recognize certain objects has resulted in the deaths of two drivers whose vehicles didn’t know to stop for tractor trailers. Teslas have also failed to stop for emergency vehicles parked on the side of the road or in a lane of traffic, which has resulted in at least 12 reported accidents. After Full Self-Driving Beta was released, users testing the software reported that left turns got worse over time; the cars also aimed at occupied lanes and scraped against bushes. Consumer Reports even compared FSD Beta to a drunk driver.

Of course, we can’t ignore the human component in these situations; had the drivers been paying attention, they likely would have recognized a dangerous situation developing and been able to take evasive action to prevent a crash. After all, drivers are technically supposed to have their hands on the wheel and their butts in a seat in order to engage Tesla’s driver-assist software.


But as Raj Rajkumar, an electrical and computer engineering professor at Carnegie Mellon University who studies automated vehicles, told CBS News: “It’s very easy to bypass the steering pressure thing. It’s been going on since 2014. We have been discussing this for a long time now.” We at Jalopnik covered all sorts of ways a driver could add steering wheel pressure without actually having their hands on the wheel. And that pressure sensor was only added after Tesla was called out for it; the company initially avoided installing one to save money.

Whatever NHTSA finds in this new phantom braking case, the very fact that Tesla’s semi-autonomous driver-assist systems consistently face so much scrutiny should be a red flag to the automaker itself, consumers, other drivers and regulatory bodies. It should also raise important questions as we continue down this autonomous vehicle path: How much testing is required before a semi-autonomous vehicle hits the road? What regulations should be in place to guarantee the safety of these technologies? And why are we using everyday drivers as beta testers for software that perplexes everyone from engineers to ethicists?