Tragedy halted the autonomous vehicle (AV) industry this week, when news broke of the first pedestrian to be killed by a self-driving car. Elaine Herzberg, 49, was walking her bicycle across the road on Sunday night when she was struck by a Volvo that Uber had been testing in autonomous driving mode. While fault for Sunday's incident has not yet been determined, Uber immediately withdrew its AV fleet from the roads. On Wednesday, Toyota did the same.
Several car manufacturers and tech companies—such as Tesla, GM and Alphabet—are testing self-driving vehicles in hopes of automating the trucking and ridesharing industries. Not only could automation reduce company overhead by eliminating the need for drivers, but autonomous vehicles are safer, too. According to the Virginia Tech Transportation Institute, autonomous cars get into only 3.2 crashes per million miles driven—nearly 24% fewer than the national average of 4.2. And since human error is responsible for the vast majority of accidents, this number could decrease even further as autonomous vehicles see broader adoption.
Today, however, most autonomous vehicles still require a human to sit in the driver's seat, ready to intervene if needed. That was the case in Sunday evening's incident.
The presence of a human driver complicates the question of liability in AV incidents. Who might be to blame when a self-driving car gets into a collision? The vehicle manufacturer? The AV software developer? The person behind the wheel?
These questions put us into uncharted legal waters. How regulators choose to answer them could be a boon or a roadblock to AV development, and it could have massive implications for the auto insurance industry.
The dual responsibility of self-driving cars
In accidents that involve a self-driving car, determining which party is at fault works the same way it does in any other collision. Did one vehicle have the right-of-way? Was a traffic law violated? Is fault for the accident shared among multiple involved parties? The facts of the incident will determine which vehicle caused the accident.
However, in AV incidents, whether the driver of that vehicle or the car manufacturer is liable is a separate issue. In commercial situations, such as Sunday's Uber collision, the company's insurance policy would typically accept liability, assuming the company's driver or vehicle was determined to be at fault. But in non-commercial situations, courts and insurance companies alike will seek further information. Was the autonomous driving mode activated in an appropriate location? Did a defect in the software or physical manufacturing lead to the collision? Could the driver reasonably have been expected to intervene?
Currently, the default blame goes to the driver
The status quo tends to implicate the human driver of the at-fault vehicle. "Fault lies with the party that set in motion the chain of events that made the event inevitable," says James Lynch, chief actuary at the Insurance Information Institute (III). "The owner or driver of the vehicle is presumed to be at fault unless there is something systematically wrong with the vehicle."
That legal tendency doesn't mean the driver is automatically held liable when a software defect is truly to blame. As the use of autonomous driving features grows, some liability will likely shift to the vehicle manufacturer or its suppliers. However, proving liability may be difficult—especially, Lynch adds, when two AVs collide. Attorneys could call into question the drivers, manufacturers, software developers, suppliers, and more until a settlement is reached. A driver could also still be held responsible if, for example, they failed to download a software update.
At the very least, the spread of autonomous cars from a manufacturer's trial runs to a mainstream automotive option will create a lot of work for liability lawyers. More broadly, the ascent of autonomous cars will challenge the insurance industry to develop new policies and practices altogether. Insurance laws will need to account for the dual liability created when a driver cedes a large portion of the responsibility to drive safely to the car itself—and to its programmers. Additionally, regulators will have to adapt driving codes to account for criminal liability in a situation where an autonomous vehicle commits a driving error.
In the meantime, however, drivers are at very little risk of getting into a collision with an autonomous vehicle. And as Uber, Toyota and other companies continue testing, they might choose to accept liability rather than endure the public relations headache of fighting individual drivers in court over responsibility.