Autonomous Vehicle Accidents: Who Is Held Liable?
Distracted driving kills more than 3,400 people in the United States annually. Over a million accidents occur every year, and autonomous vehicles will, hopefully, prevent many of them, potentially saving tens of thousands of lives per year.
But what happens when an autonomous vehicle is involved in an accident?
Who is liable for the accident?
“When it comes to autonomous vehicles, the existence of an anomaly may be more difficult to identify given the catastrophic nature of automobile accidents. It may be difficult for a plaintiff who’s been injured by an autonomous vehicle to prove the existence of a manufacturing flaw if the evidence is destroyed at the scene,” writes Ankin Law Office LLC.
The vehicle itself, operated entirely by artificial intelligence, cannot be personally sued.
Uber’s autonomous test vehicle killed a person walking her bicycle across the road, leading the ridesharing company to pull all of its test vehicles off the road. Toyota followed, removing its vehicles from the road as well.
Autonomous vehicles have a much lower accident rate per million miles driven, 24% lower to be exact, than the national average. Human error is responsible for the majority of accidents involving autonomous vehicles, so this rate should fall even further as more autonomous vehicles take to the road.
Today’s autonomous vehicles are not 100% autonomous. Instead, a driver still sits behind the wheel, ready to intervene if the vehicle makes the wrong decision while driving. If the driver tasked with operating the vehicle fails to make the correction, that negligence can make them liable for their failure to act.
Of course, in the case of Uber, the driver was an employee, and Uber is likely to be held liable.
There’s also the question of whether the driver did act, but a defect prevented the vehicle from responding.
Manufacturer vs. Driver Fault
A vehicle’s manufacturer may be at fault for an accident if a defect is found. Sensor or AI defects may be difficult to prove, however, because these components may themselves have been damaged in the accident.
Insurance companies will want further proof of how the accident occurred, and the autonomous vehicle’s stored data may be analyzed to determine whether the vehicle made the appropriate corrections.
Software or physical defects may lead to manufacturers being at fault.
But then there’s the question of whether the driver intervened as expected. An opposing driver may also be at fault if their actions set in motion the events that resulted in the accident.
Let’s assume that the opposing driver, not in an autonomous vehicle, veered into the autonomous vehicle’s lane and perhaps clipped the front of the autonomous vehicle. If the autonomous vehicle was traveling at the appropriate speed and reacted properly, it’s safe to assume that the opposing driver was at fault.
In this case, the driver of the opposing vehicle would be held liable for the accident.