A Tesla operating on Autopilot rear-ended a fire truck in California. An automated vehicle being tested by Uber struck and killed a pedestrian in Arizona. In Florida, a driver was killed in a crash while his Tesla was in Autopilot mode. The more companies test their automated or self-driving vehicles on public roads, the more we’ll hear about accidents and fatalities.
There are a lot of unanswered questions about automated vehicles, including exactly who is responsible when a collision occurs. As with many new technologies meeting present-day reality, the answer is complicated.
Drivers Using Autopilot Mode
In the automated vehicles currently on the road, a driver is still behind the wheel and can take over from the system if there’s a problem. In the Florida Tesla accident, it appears the car warned the driver to disengage Autopilot and take control of the vehicle, and he either didn’t or didn’t do it in time.
Automated vehicles are also equipped with automatic emergency braking (AEB) systems. If the car senses an impending collision and the driver doesn’t react in time, the car is supposed to brake on its own. In some systems, though, emergency braking doesn’t engage until the driver touches the brake pedal.
That leaves plenty of room for human error. For a vehicle to be truly driverless, it must be safe on its own, with safety features that react faster and more reliably than a human driver can.
When the Manufacturer Is Responsible
Self-driving car accidents could implicate the manufacturer more than the driver or the victim. In the Uber fatality in Arizona, a woman walked out of the shadows, off the median, to cross the street, and the vehicle’s sensors didn’t detect her in time. If the accident was caused by a faulty sensor, the case could fall under product liability law.
Under product liability law, manufacturers are responsible when defects or errors in their products cause damage, harm, or injury. A preliminary investigation of the accident, however, found that the sensors and the vehicle worked as programmed.
As automated vehicles go mainstream over time, companies like Uber that pick up fares with driverless cars could be held to the common carrier doctrine, which holds transportation providers such as bus and taxi companies to a higher standard of care toward their passengers. Does this mean accidents in a driverless Uber would automatically be considered the fault of the manufacturer or the company?
The Law Isn’t Clear
Right now, the law isn’t clear on exactly who’s responsible. Because most of the automated vehicles involved in collisions have been in test phases, few cases have made their way through the courts. Even a 2018 lawsuit against General Motors, filed by a motorcyclist who was hit by a self-driving Chevrolet Bolt, offers no clear answer. The police report suggests the person who was hit may have been at fault, though that didn’t stop the lawsuit from moving forward.
As we said before, determining who’s responsible for automated vehicle accidents is complicated. If the vehicle performs as designed, it may be difficult to hold the manufacturer liable. And if the accident victim’s own actions played a part, that could reduce anyone else’s liability.
The law hasn’t caught up to self-driving cars just yet, so it may be a while before anyone can say with certainty who’s responsible for an accident. Until then, it’s best that all drivers carry plenty of auto insurance. Need to update your current policy? Looking for a quote to save money on your auto insurance? Contact Charlotte Insurance today!