Autonomous Truck Developer Under Federal Investigation After Highway Crash Prompts Safety Concerns

Screenshot: The Asian Mai Show – Official Trucking Channel via YouTube

In early April, a tractor-trailer fitted with autonomous driving technology veered off the road without warning, cutting across I-10 in Tucson, Arizona, and slamming into a concrete barrier.

According to the Wall Street Journal, the accident report, made public by regulators in June, underscores concerns that autonomous trucking company TuSimple is risking public safety on the roads in order to rush its product to market. That assessment comes from independent analysts and more than a dozen of the company’s former employees.

Now, the Federal Motor Carrier Safety Administration, an agency within the DOT that regulates trucks and buses, has launched a “safety compliance investigation” into the company. The National Highway Traffic Safety Administration has joined the investigation as well.

TuSimple says human error is to blame for the April incident, but autonomous driving specialists say details in the June regulatory disclosure and internal company documents show fundamental problems with the company’s technology.

Video of the accident was posted to a trucking YouTube channel.

Alleged Whistle Blower Shares Raw Video Of Self Driving Semi Truck Crashing Into Median 🤯

An internal document viewed by the WSJ states that the truck abruptly veered left because a person in the cab hadn’t properly rebooted the autonomous driving system before engaging it. That caused the system to execute a left-turn command that was 2.5 minutes old. If the truck was traveling at 65 mph, that command corresponded to a point nearly three miles back down the road… which isn’t good. The command should have been erased from the system, but it wasn’t.
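
For a sense of the distance involved: at 65 mph, a truck covers about 2.7 miles in 2.5 minutes. Below is a minimal Python sketch of that math, plus the kind of command-freshness check that would have discarded the stale turn. The names and the threshold value are illustrative assumptions, not TuSimple’s actual code.

import time

# Assumed cutoff: reject steering commands more than a couple
# hundredths of a second old.
MAX_COMMAND_AGE_S = 0.02

def miles_traveled(speed_mph, elapsed_s):
    # Distance covered while a stale command sat in the system.
    return speed_mph * (elapsed_s / 3600.0)

print(round(miles_traveled(65, 2.5 * 60), 2))  # -> 2.71 miles off target

def is_fresh(command_timestamp_s, now_s=None):
    # A steering command should execute only if it is effectively live.
    if now_s is None:
        now_s = time.monotonic()
    return (now_s - command_timestamp_s) <= MAX_COMMAND_AGE_S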

On its website, TuSimple acknowledged the investigation and said it takes responsibility for finding and resolving safety issues.

Researchers at Carnegie Mellon University dispute that it was all human error. They say common safeguards – like making sure the system can’t respond to commands more than a couple hundredths of a second old, or making it impossible to engage an improperly functioning self-driving system – would have prevented the crash. They also suggest the system shouldn’t permit an autonomously driven truck to make such a sharp turn at 65 mph.
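
Taken together, those safeguards amount to a few simple gates in front of the steering controller. Here’s a hedged Python sketch of what they could look like; the function names, thresholds, and steering limit are hypothetical stand-ins, not details of TuSimple’s system.

from dataclasses import dataclass

MAX_COMMAND_AGE_S = 0.02      # assumed staleness cutoff
MAX_STEER_DEG_HIGHWAY = 5.0   # assumed cap on steering angle at highway speed

@dataclass
class SteerCommand:
    angle_deg: float
    age_s: float

def may_engage(system_healthy, fully_rebooted):
    # Interlock: a malfunctioning or partially initialized self-driving
    # system simply can't be switched on.
    return system_healthy and fully_rebooted

def validate(cmd, speed_mph):
    # Gate 1: discard stale commands (the 2.5-minute-old April command fails here).
    if cmd.age_s > MAX_COMMAND_AGE_S:
        return False
    # Gate 2: no sharp swerves at highway speed.
    if speed_mph >= 65 and abs(cmd.angle_deg) > MAX_STEER_DEG_HIGHWAY:
        return False
    return True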

“This information shows that the testing they are doing on public roads is highly unsafe,” said Phil Koopman, an associate professor at Carnegie Mellon who has contributed to international safety standards for autonomous vehicles, referring to the company’s disclosures.

TuSimple said that after the accident, it modified its autonomous-driving system so that a human can’t engage it unless the computer system is fully functional. A former TuSimple engineer said the move was long overdue. A TuSimple spokesman responded that the April crash was the only accident for which a company truck was responsible.

Even though the truck in this crash had two people on board, TuSimple is also testing driverless “Ghost Rider” trucks on public roads, something it started back in December of 2021. That was only supposed to happen after 500 practice runs, but the company reportedly completed fewer than half of those before the December drive.

This accident follows years of management pushing back against what some former employees describe as serious safety and security issues.

In late 2021, a group of employees raised some of these issues with the legal department, according to people familiar with the matter. A presentation cited the company’s alleged failure to regularly check software for vulnerabilities and its use of unencrypted communications to manage trucks, which could give hackers an opening to intercept data traveling between engineers and the vehicles’ systems, the people said.

Safety drivers, meanwhile, have flagged failures in a standard safety feature that is supposed to let them shut off the self-driving system by turning the steering wheel; it didn’t always work, other people familiar with the matter said. Company management dismissed the safety drivers’ concerns, the people said.
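
The feature they’re describing is simple in principle: enough driver torque on the wheel should immediately hand back control. A hypothetical Python sketch, with an assumed torque threshold:

OVERRIDE_TORQUE_NM = 3.0  # assumed torque signaling a deliberate human takeover

def should_disengage(driver_torque_nm, autonomy_engaged):
    # Driver input on the steering wheel wins: disengage without delay.
    return autonomy_engaged and abs(driver_torque_nm) >= OVERRIDE_TORQUE_NM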

A spokesperson for TuSimple says the company “actively solicits and reviews flags, concerns and risks our employees identify so they can be addressed.”

TuSimple has been a leader in autonomous truck development since it launched in 2015. It’s backed by UPS, U.S. Xpress and Volkswagen.