The Tesla Autopilot Crashes Just Keep Coming

Picture from a Tesla Autopilot-related crash just before impact with a disabled vehicle:


(Video here on Twitter.) Tesla Autopilot crashes are still happening when drivers (apparently) succumb to automation complacency. It seems they've just stopped being news.
The above picture is from a Tesla camera a fraction of a second before impact. (Somehow, it seems there was no injury.) The Tesla is said to have initiated AEB (automatic emergency braking) and disabled Autopilot about two seconds before impact. The video shows a clear sightline to the disabled vehicle for at least five seconds, but the driver apparently did not react.

Tesla fans can blame the driver all they want — but that won’t stop the next similar crash from happening. Pontificating about personal responsibility and that the driver should have known better won’t change things either. And we’re far, far past the point where “education” is going to move the needle on this issue.

It’s time to get serious about:
– Requiring effective driver monitoring
– Addressing the very real #autonowashing problem that leads so many users of these features to believe their cars really drive themselves.
– Requiring vehicle automation features to account for reasonably foreseeable misuse (you might fix the vehicle, or you might fix the driver, or more likely fix both, but casting blame accomplishes nothing)

The deeper issue here is the pretense that autopilot-type systems involve humans who believe they are driving. In reality, the car is driving and the humans are along for the ride, no matter what disclaimers appear in the owner's manual or the on-screen warnings. That remains true unless the vehicle designers can show they have a driver monitoring system and engagement model that produce real-world results.


The reality is that these are not "driver assistance" systems. They are automated vehicles with a highly problematic approach to safety. This goes for all companies; Tesla is simply the most egregious example, due to the poor quality of its driver monitoring and the scale of its deployed fleet. As human-supervised automated driving gains more functionality, the safety problem will only keep getting worse.

Source on Twitter: https://twitter.com/greentheonly/status/1607475055713214464?ref_src=twsrc%5Etfw from Dec 26th, containing video of the impact. No injury is apparent to the person in the video, but it was a very close thing. A screenshot of the vehicle log shows AEB engaging two seconds before impact: https://twitter.com/greentheonly/status/1609271955383029763?ref_src=twsrc%5Etfw
For those saying "but Teslas are safer overall": that claim is not supported by any credible data I have ever seen: https://safeautonomy.blogspot.com/2022/12/take-tesla-safety-claims-with-about.html