Self-Driving Car Users Shouldn’t Be Held Responsible For Crashes, U.K. Report Says

Topline

Self-driving car users would be given immunity from offenses like dangerous driving, breaking speed limits and running red lights under legal reforms proposed by the Law Commission of England and Wales and the Scottish Law Commission.

Key Facts

In a joint report published Tuesday, the commissions recommended that legal responsibility for accidents caused by self-driving vehicles should rest not with the person in the driver’s seat, but with the company or body that obtained authorization for the self-driving features used by the vehicle.

U.K. Transport Minister Trudy Harrison said she hoped the report would build public confidence in self-driving technology that could make everyday travel safer and more environmentally friendly.

Technology currently available to consumers falls short of full autonomous driving and, in the U.S., legal liability for violations by vehicles using automated driving technology rests with the driver.

On Thursday, the Insurance Institute for Highway Safety announced plans to release safety ratings for vehicles that use partial automation—a technology that can be dangerous without adequate safeguards, IIHS President David Harkey said.

In October, Kevin George Aziz Riad, 27, was charged with two felony counts of vehicular manslaughter with gross negligence in connection with the deadly 2019 wreck in Gardena, California, of a 2016 Tesla Model S that used an autopilot feature—the first instance in the U.S. of a driver facing felony prosecution for a crash involving automated driving technology.

Param Sharma, 25, was charged with two counts of reckless driving in May after setting a Tesla to drive him around the San Francisco Bay Area on autopilot while he relaxed in the back seat, which he described as a “magical experience,” NBC reported.

Key Background

Tesla, GM, Google and other auto manufacturers and tech companies have invested billions in developing driverless vehicle technology. However, no truly driverless vehicles are on roadways today; the current generation of commercially available systems can do little more than keep vehicles in marked lanes and brake automatically when a hazard is detected. Even if driverless technology does arrive, human intervention will still be necessary to service the vehicles and prevent illegal activity, MIT researcher Ashley Nunes told the Financial Times. A legal definition of “self-driving” technology has not yet been established in the United Kingdom, and the legal reforms proposed by the Law Commission of England and Wales and the Scottish Law Commission may only apply to technology that does not yet exist.

Crucial Quote

“We have an unprecedented opportunity to promote public acceptance of automated vehicles with our recommendations on safety assurance and clarify legal liability,” said Public Law Commissioner Nicholas Paines. “We can also make sure accessibility, especially for older and disabled people, is prioritized from the outset.”

Contra

Under the proposed legal reforms, self-driving car users would still be responsible for driver duties such as carrying auto insurance and making sure children wear seatbelts.

What To Watch For

Authorities will decide whether to introduce legislation to put the commissions’ recommendations into effect.

Surprising Fact

In 2015, Tesla CEO Elon Musk predicted that self-driving cars with unlimited mobility would be available by 2018 at the latest.

Further Reading

“Self-Driving Car Users Should Not Be Responsible ‘If Anything Goes Wrong’: Report” (Bloomberg)

Source: https://www.forbes.com/sites/zacharysmith/2022/01/25/self-driving-car-users-shouldnt-be-held-responsible-for-crashes-uk-report-says/