Who is Liable When a Self-Driving Car Gets into an Accident?

Self-driving cars might soon become a common part of our lives–TechCrunch recently reported a study predicting that 25 percent of driving could be done by autonomous vehicles by the year 2030.

But the new technology is far from perfect, and it has recently caused a number of incidents. In late March, a Tesla in autopilot mode hit a police motorcycle in Arizona, although the officer was able to jump off the motorcycle before he was injured. A self-driving Volvo being used by Uber was involved in another crash in Arizona around the same time. Over the summer, a Tesla driver died in a crash while the car was on autopilot (the driver was watching a Harry Potter movie at the time). And in September, one of Google's self-driving vehicles ran a red light and collided with the passenger's side of another vehicle.

So, when there isn't a human behind the wheel, who takes the fall in court for accidents like these? Many lawyers and legal experts who have weighed in on the issue believe that the automobile manufacturers should be held liable.

However, not every case involving self-driving car crashes is the same. Recently, Michael I. Krauss, a professor at George Mason University’s Antonin Scalia Law School, explored how different types of accidents and malfunctions for different types of vehicles should be handled under tort law in a piece for Forbes Magazine. Tort law involves civil cases in which one party has faced injury or damages and another party has been accused of being responsible for them.

According to Krauss, if an accident occurs because of a “manufacturing defect”–meaning the car does not operate as it was designed to operate–then the company that built it should be at fault. If there was an “informational defect”–meaning the car’s owner was not properly educated about how to operate it, and used it incorrectly as a result–then, Krauss writes, the car company should be liable only if it was negligent and failed to give sufficient instructions or warnings.

However, Krauss notes that “design defects” create a legal gray area. A design defect would occur if the choices the car has been programmed to make lead the driver into an accident in response to an unforeseen situation. For example, Krauss says that if a moose jumps in front of the car, it could choose to hit the moose and potentially kill the driver, or swerve onto the sidewalk and endanger pedestrians. He argues that decisions about liability in these scenarios should once again be based on whether the manufacturers were negligent or whether they made the best possible design choice. Such decisions could be left up to juries or decided beforehand by regulators, based on what a reasonable person might conclude, Krauss writes.

The Society of Automotive Engineers (SAE) has established six levels of driving automation, with level zero indicating that the driver has full control and level five indicating that the car is completely autonomous. Bryant Walker Smith, a law professor at the University of South Carolina, told USA Today that a human driver is responsible for any crashes involving a vehicle ranked lower than level three. Smith added that because most accidents are caused by human error, which automated vehicles aim to eliminate, a growing reliance on self-driving cars could mean fewer accidents and thus fewer legal disputes.

But the technology isn’t perfect, and can still make the same mistakes as humans–like speeding or running through red lights. Questions about who would take the blame for these violations remain unanswered.

Government regulation of self-driving cars could be changing under President Donald Trump’s White House. In September, former President Barack Obama’s administration released a set of standards for self-driving car manufacturers that would require them to conduct extensive safety assessments and provide the results to the federal government. Because legislation addressing the vehicles varies from state to state, the Department of Transportation released a centralized list of guidelines each state could adopt. But Elaine Chao, the new transportation secretary in Trump’s administration, is now reevaluating the old administration’s rules as companies like Google and Uber, which develop the vehicles, push back against the amount of information they would have to report. Chao has cited safety and jobs–because the technology would eliminate the need for occupations like truck driving–as her main concerns as she considers the issue.

Victoria Sheridan
Victoria is an editorial intern at Law Street. She is a senior journalism major and French minor at George Washington University. She’s also an editor at GW’s student newspaper, The Hatchet. In her free time, she is either traveling or planning her next trip abroad. Contact Victoria at
