Who’s to blame when an automated car crashes?

Automated cars on the road

As technology improves and automated and self-driving cars are eventually allowed on Britain’s roads, legal experts argue that drivers should not be blamed when things go wrong.

Laws have been drawn up to allow driver-assistance technology to be used on British roads in limited ways initially, such as hands-free driving in vehicles with lane-keeping technology on congested motorways at speeds of up to 37mph.

As this could pave the way for fully automated cars in future – and could be a great help to taxi drivers – the Law Commission of England and Wales and the Scottish Law Commission are calling for those in the driver’s seat to be exempt from prosecution if anything goes wrong with the automation that leads to a collision, speeding or the ignoring of traffic signals and lights.

They argue that the company or software developer behind the technology should be legally responsible for anything that goes wrong while the automated features are being used.

This differs from the current situation in America, where, although lawsuits have been filed against manufacturers such as Tesla following fatal collisions involving automated technology, several drivers are also facing prosecution on charges including vehicular manslaughter.

While emphasising the features of the technology, Tesla stresses that drivers must remain in control at all times, even when the automated systems are engaged, much as airline pilots remain responsible when using autopilot.

It says on its website: “Autopilot, Enhanced Autopilot and Full Self-Driving Capability are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment.”


While the British legal framework takes a different stance from that in the US, lawmakers over here want clearer marketing around the hi-tech features of cars, especially the difference between self-driving and driver-assistance technology, to avoid confusion about a vehicle’s capabilities.

They want to avoid situations in which a driver fails to react to danger because an advert or sales pitch has led them to believe the automated system will brake or steer around an obstacle.

The government also welcomes technological advances that could help make roads safer, but warns the law has to be clear.

Transport minister Trudy Harrison said: “This Government has been encouraging development and deployment of these technologies to understand their benefits. However, we must ensure we have the right regulations in place, based upon safety and accountability, in order to build public confidence.”

It is a warning echoed by the RAC. Head of roads policy Nicholas Lyes said: “While self-driving cars offer the potential to make our roads safer and increase mobility for those who can’t currently drive, it’s vital motorists aren’t lulled into a false sense of security by the way some manufacturers describe their automated technology.

“We believe there’s a big difference between driver-assistance features, such as adaptive cruise control, and genuine self-driving capability, so we support the Law Commission recommending that a clear distinction between the two is drawn when manufacturers market their vehicles.”

Time will tell whether vehicle and software manufacturers will be held legally responsible, but, as is the case now, properly insuring the vehicle, whether with private hire taxi insurance or public hire insurance, will remain the driver’s responsibility.


They must also make sure passengers are wearing seatbelts and that any loads or trailers are secure.

The technology has the potential to reduce accidents and make all our roads safer, but there needs to be clear legislation in place for when things do go wrong.