The car industry has been flooded with news releases about the development of self-driving vehicles. Heavy hitters like Google and Tesla have already made their mark in this niche, and other major car companies are following suit. However, with new developments, testing, and product issues come many questions that will weigh heavily on whether these vehicles sink or swim.
One major question that has arisen is "who is responsible when an autonomous vehicle is involved in a car accident?" It is a good question.
For drivers involved in an accident with another driver, fault is normally clear. An accident report is submitted, and the insurance companies (fingers crossed) pay for the damages. If the situation becomes sticky, which it often does, personal injury attorneys (like Kaine Law) step in and assist individuals who were hurt or killed in an accident.
However, a driverless car that relies solely on technology can create a number of problems if, or more likely when, it is involved in an accident. An article published by The Atlantic poses a series of questions that could determine whether driverless cars are feasible at all.
Liability then becomes the main focus, because those involved in the accident must determine who is responsible. It may seem nearly impossible for our court systems to deal with such a new, innovative, and complex issue. However, some argue that both the car industry and our legal system are prepared to take on the task. In fact, it looks like they won't have a choice.
The article notes that product liability is an evolving area of law, and that plaintiffs in products-liability lawsuits have pursued various "theories" of liability when seeking damages. Negligence also plays an important role when a manufacturer fails to design a product that is safe in reasonably foreseeable uses. The example used in the article is an automated braking system that works perfectly during the day but fails at night in the dark, leading to fender benders and even more serious car accidents. A person who is hit could claim that nighttime driving was a reasonably foreseeable use of the self-driving vehicle, and that the manufacturer therefore failed to ensure the technology worked properly.
This discussion, and others like it, will flood our newsfeeds as more autonomous vehicles are tested and eventually used by consumers. Questions about responsibility, personal injury liability, and negligence are sure to follow. In the meantime, instead of being frustrated by the other drivers on the road, perhaps we should value the person next to us, since soon we may be passed, or even hit, by a self-driving vehicle.
For more information on this topic, contact Kaine Law.