Autopilot’s Reliance On Only Cameras Is Tesla’s ‘Fundamental Flaw’
So far this year, Tesla cars equipped with Autopilot and Full Self-Driving software have been caught hitting parked police cars, clipping trains and veering off the road. Now, a report has revealed that the crashes are the result of a “fundamental flaw” in the way the software works.
Tesla’s Autopilot system works through an array of cameras positioned across its cars, including on the front, rear and sides. The cameras constantly survey the area around the car and feed a machine-learning system that calculates how the vehicle should behave when Autopilot or Full Self-Driving is engaged.
The software is trained by data experts at Tesla to spot road signs and respond accordingly. It’s also programmed to recognize obstacles in the road, like stopped trucks or animals. However, the cars only “know” how to respond to an obstacle if it’s something that the software has been trained to spot – if not, then they won’t know how to respond.
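The limitation described above can be sketched in a few lines of Python. This is a hypothetical toy example, not Tesla’s actual code: it just shows that a system which maps detections onto a fixed set of trained labels has no response at all for anything outside that set.

```python
# Hypothetical sketch: a planner that only knows how to react to obstacle
# classes present in its training data. The labels and actions are invented
# for illustration.
KNOWN_OBSTACLES = {
    "stop_sign": "brake",
    "pedestrian": "brake",
    "stopped_truck": "slow",
}

def plan_response(detected_label: str) -> str:
    """Return a driving action for a recognized obstacle, if any."""
    # A class the software was never trained on falls through to "no_action" -
    # the system simply has no rule for it.
    return KNOWN_OBSTACLES.get(detected_label, "no_action")

print(plan_response("pedestrian"))          # brake
print(plan_response("overturned_trailer"))  # no_action: never seen in training
```

A human driver would still react to the unfamiliar object; the sketch shows why a purely learned lookup does not.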
This shortcoming has been outlined in a new report from the Wall Street Journal, which investigated more than 200 crashes involving Tesla cars equipped with Autopilot and FSD.
In the report, which is available to watch here, the outlet uncovered hours of footage of crashes involving Teslas. Of more than 1,000 crashes that were submitted to the National Highway Traffic Safety Administration by Tesla, the Journal was able to piece together 222 incidents to analyze. The 222 crashes included 44 in which Tesla cars veered suddenly and 31 in which cars failed to stop or yield.
“The kind of things that tend to go wrong with these systems are things like it was not trained on the pictures of an over-turned double trailer – it just didn’t know what it was,” Phil Koopman, associate professor of electrical and computer engineering at Carnegie Mellon University, told the WSJ.
Tesla boss Elon Musk is adamant that Autopilot will save lives. Photo: Nora Tam/South China Morning Post (Getty Images)
“A person would have clearly said ‘something big is in the middle of the road,’ but the way machine learning works is it trains on a bunch of examples. If it encounters something it doesn’t have a bunch of examples for, it may have no idea what’s going on.”
This, the Journal says, is the “fundamental flaw” in Tesla’s Autopilot technology and its Full Self-Driving software. According to the WSJ:
Tesla’s heavy reliance on cameras for its autopilot technology, which differs from the rest of the industry, is putting the public at risk.
Teslas operating in Autopilot have been involved in hundreds of crashes across U.S. roads and highways since 2016. Over the years, Tesla CEO Elon Musk has maintained that the technology is safe.
However, external experts aren’t so sure. The WSJ spoke with Missy Cummings, director of the Mason Autonomy and Robotics Center at George Mason University, who has repeatedly warned that people could die behind the wheel of Teslas operating FSD and Autopilot.
“I am besieged with requests from families of people who have been killed in Tesla crashes,” Cummings told the WSJ. “It’s really tough to explain to them that, you know, this is the way the tech was designed.”
Instead of relying heavily on cameras and computer vision, other automakers add more complex sensors to their self-driving prototypes. At companies like Volvo, lidar and radar are employed to scan the road ahead. These sensors survey the road using radio waves and lasers to get a clearer view of the path even in foggy, dark or other conditions in which a camera isn’t as effective.
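The value of that redundancy can be illustrated with another minimal, invented sketch: even when a camera classifier cannot label an object, a lidar or radar return still reports that something solid sits in the lane, so the car can react anyway. The labels and the distance threshold below are illustrative assumptions, not any automaker’s real logic.

```python
# Hypothetical sensor-fusion sketch: brake if EITHER the camera recognizes a
# hazard OR a lidar/radar return shows an object close ahead, even unlabeled.
from typing import Optional

CAMERA_HAZARDS = {"pedestrian", "stopped_vehicle"}

def should_brake(camera_label: Optional[str], lidar_range_m: float) -> bool:
    """Decide to brake from camera classification plus raw lidar range."""
    camera_sees_hazard = camera_label in CAMERA_HAZARDS
    lidar_sees_object = lidar_range_m < 30.0  # illustrative threshold in meters
    return camera_sees_hazard or lidar_sees_object

print(should_brake(None, 12.0))  # True: camera unsure, but lidar detects an obstacle
print(should_brake(None, 80.0))  # False: nothing close ahead
```

The point of the second sensor isn’t better classification – it’s a physical detection that doesn’t depend on the object ever appearing in training data.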
Tesla’s self-driving rival Volvo uses lidar in its systems. Image: Volvo
The additional expense of such systems has been a big factor in Tesla’s decision not to use them, with company boss Elon Musk once referring to them as “unnecessary” and like fitting the car with a “whole bunch of expensive appendices.”
The value this added cost brings shouldn’t be underestimated, however. At Mercedes, the inclusion of lidar, radar and 3D cameras has paved the way for its self-driving systems to roll out onto America’s roads. In fact, the automaker became the first company to get the green light for level three autonomy last year with its Drive Pilot system in California and Nevada.
The level three system is a step ahead of what Tesla can offer, and Mercedes also goes so far as to take full legal liability when Drive Pilot is activated. Imagine if Tesla did that for cars operating Autopilot and FSD.