Podcast

Can Laws Reverse Public Mistrust of AV?

It’s past time to clarify the liability issues of highly automated vehicles.
William Widen (left) and Phil Koopman


By Junko Yoshida

Guests:
William H. Widen, Professor, University of Miami School of Law
Philip Koopman, Associate Professor of Electrical and Computer Engineering, Carnegie Mellon University

Robotaxi operations in San Francisco today put on full display the disarray of politics, technology, regulation and business interests surrounding fully autonomous vehicles (AVs).

Evoking the streets of San Francisco, auto-safety expert Phil Koopman—in a podcast chat with Junko and Bola at the Ojo-Yoshida Report—said that public confidence in AVs is on a “downhill slide.”

If society is serious about regaining people's trust in AVs, the needed fixes are numerous. First might be turning down the decibels and desperation in AV marketing messages. Next? Improved technological and operational implementations of AVs, plus a substantial effort to repair relations with the city of San Francisco and its first responders. Finally, there must be much more transparency in sharing collected data with independent safety assessors.

Less understood by the public, frequently dismissed by the AV industry, and often missing from the public discourse are issues of liability and regulatory reform.

This episode of “Chat with Junko and Bola” zeroes in on the legal issues of highly automated vehicles with guests Bill Widen and Phil Koopman.


Law is ‘so broken’
First, acknowledge, as Koopman stated, that “the law is so broken today” that it cannot deal with vehicles that drive themselves.

For instance, when the car, not the person, breaks a traffic law, a city traffic cop "can't even write tickets, because a ticket for a moving violation requires the name of a driver," said Koopman. "Should it be the CEO of the AV company?" At the moment, no one has an answer.

An even more urgent issue is liability. “We are not really sure who is responsible for losses from accidents that are caused by autonomous vehicles,” pointed out William Widen.

Good luck with proving a ‘product defect’
A layperson might regard this matter as something to be ultimately sorted out in court. That thinking is pretty much in line with the auto industry, whose leaders insist there is no need for legal reform. They say existing rules are up to the job of allocating all kinds of liability. Widen disagrees.

To prove a design defect, or a product defect in the machine learning system would put any person who is a victim of an ordinary automobile crash at a big disadvantage.

William Widen

Think about the technology in highly automated vehicles, said Widen. "To prove a design defect, or a product defect in [an] automated driving system," he said, is almost too complicated to contemplate. The time and expense needed to prove a defect in just the machine learning system would "put any person who is a victim of an ordinary automobile crash at a big disadvantage."

Koopman simplified the argument further by suggesting, “Suppose a vehicle runs a red light,” causing a collision.

"If a car runs a red light, you don't really want to hire someone… spending hundreds of thousands, more likely millions of dollars," he said, to obtain all the source code and then fly in experts who spend weeks in a locked, secure room with a guard outside, hunting for the neuron in the neural network that caused the accident. "I mean," concluded Koopman, "good luck with that."

He added, “The reality is that the car ran a red light and hit something, or it failed to yield to a firetruck. Why are we looking at the source code or the neural net training data when it ran a red light?”

Introduction of a ‘Computer Driver’
Both Widen and Koopman suggest legislatures should amend state laws to accept the concept of a “Computer Driver” for highly automated vehicles.

This legal construct makes the Computer Driver responsible for driving the way a human at the wheel would, explained Widen. Any time the Computer Driver falls short of the duty of care typically imposed on a human driver, the machine would be liable for negligence.

Because the Computer Driver can't go to jail or pay a fine, "the manufacturer is the responsible party," said Koopman. The carmaker, which gave birth to the AV, is "the parent of the Computer Driver."

Treating the Computer Driver as a human driver's equal is an elegant solution, noted Koopman, because it doesn't require the rest of the existing legal framework to change.

Mercedes-Benz ‘stands behind our product’
Asked about liability issues affecting its new Level 3 vehicle, Mercedes-Benz said that as long as consumer drivers don’t misuse the product, “we stand behind it.”

Most consumers don’t understand that product liability has nothing to do with negligence.

Phil Koopman

Note that Mercedes-Benz is talking about “product” liability.

“Most consumers don’t understand that product liability has nothing to do with negligence,” stressed Koopman. They don’t understand how painstaking, painful and almost impossible it is to prove product liability.

By amending state laws to expressly acknowledge the possibility of an ordinary negligence claim against a Computer Driver, Widen and Koopman believe the law can hold “the manufacturer of the Computer Driver responsible for losses proximately caused by negligent computer driving.”

Widen believes that "passing a statute that sets forth clear lines of responsibility" is the simplest way to restore AV credibility.

Such regulatory action doesn’t solve every safety problem, acknowledged Koopman, but it’s “a safety net” that “gets you in the right ballpark.”

Widen said, "When I was testifying before a Washington State Commission, I said [to the AV industry], 'Look, I don't like regulation. I'd like to like AVs. If you get really clear liability rules, I would be very prepared to give [AV] companies near carte blanche to deploy these things, because the risk of them making payouts would motivate safety in a way that this lack of clarity does not.'"

In short, “I would trust the legal system to create the proper financial incentives,” he added.

