Tesla has become the world’s most successful seller of electric vehicles. Of course, the company wants its cars to reach their destinations safely without anyone driving them. That may happen in the future, but for now, the company’s cars are causing accidents that injure and kill people. Are all of these incidents happening only because the owners are careless? Several lawsuits addressing this question have been brought in recent years.
Inattentive Tesla Driver Charged With Manslaughter
In January, a California driver was charged with two counts of vehicular manslaughter after his Tesla ran a red light and collided with another vehicle, killing two people inside, reports the Associated Press. The car was using its Autopilot partial self-driving feature, and the driver, Kevin George Aziz Riad, is accused of not actively driving the vehicle when the crash occurred in 2019.
Prosecutors claim Riad’s Model S was traveling at high speed when it left a highway, ran a red light, and struck a Honda Civic. Riad and a passenger in his car were injured and hospitalized after the crash. The accident happened in Gardena, a Los Angeles suburb.
This is reportedly the first felony prosecution (vehicular manslaughter charges) for a fatal crash involving a widely used, semi-automated driving system. Under California law, there are three kinds of manslaughter: voluntary, involuntary, and vehicular.
Vehicular manslaughter is defined in part as, “…driving a vehicle in the commission of a lawful act which might produce death, in an unlawful manner, and with gross negligence.” Riad, 27, pleaded not guilty and was freed on bail while the case is pending. If convicted, Riad faces up to a year in prison on each count.
The families of the two people killed in the Gardena crash, Gilberto Alcazar Lopez and Maria Guadalupe Nieves-Lopez, are suing Tesla and Riad. They allege that Riad was negligent and that Tesla sold defective vehicles that can suddenly accelerate and lack an effective automatic emergency braking system. A trial is scheduled for next year if the cases don’t settle.
Making Vehicles Partially Self-Driving May Make Them More Dangerous
Tesla’s Autopilot feature can control steering, speed, and braking, and there have been numerous accidents during its use. Tesla is currently under investigation by the National Highway Traffic Safety Administration (NHTSA) over 11 accidents since 2018 that injured 17 people and killed one. An estimated 765,000 Teslas in the US are equipped with Autopilot.
NHTSA and the National Transportation Safety Board (NTSB) are reviewing driver misuse of Autopilot. The agencies blame overconfident and inattentive drivers for multiple crashes, a problem the NTSB has called “automation complacency.”
The NTSB investigated a 2018 crash in Culver City in which a Tesla with Autopilot engaged hit a fire truck; the agency found the system “permitted the driver to disengage from the driving task.” No one was injured in that accident. Since 2016, NHTSA has investigated 26 crashes in which Autopilot was in use, accounting for 11 deaths.
Tesla claims that, since the fatal Autopilot-related crashes began, it has changed the software to force drivers to stay more engaged. The company is also testing its more sophisticated “Full Self-Driving” system, which, like Autopilot, cannot safely drive the car by itself. Drivers must pay attention and be able to react at any time.
Who’s Responsible for These Accidents?
In accident cases involving a Tesla with Autopilot engaged, drivers may be sued for negligence if they weren’t paying attention and mistakenly relied on their vehicles to drive themselves safely.
Tesla may also be sued for negligence. Plaintiffs may claim Teslas aren’t reasonably safe vehicles if they’re equipped to be operated by drivers who are only half paying attention. If Autopilot can’t perceive and avoid dangerous situations, they will argue, the cars are too dangerous.
Tesla will try to blame drivers, claiming it warned them not to misuse Autopilot and to remain in control. But that finger-pointing may only get the company so far. Product liability law can make those who manufacture and sell a dangerous product pay for the harm it causes, without the plaintiff having to prove negligence.
That usually requires showing a product is dangerous when used as intended. Tesla can argue Autopilot isn’t designed to be used by drivers who aren’t paying attention, but the law also imposes liability when misuse is reasonably foreseeable.
An injured plaintiff would need to show that a reasonably foreseeable misuse made the product defective because Tesla knew, or should have known, of the risk that misuse created. Given the accidents, lawsuits, and government investigations into Autopilot-related crashes, it would be hard for Tesla to claim ignorance. Forbes magazine ran an article on a case involving a fatal accident in Japan. Although that suit was eventually dismissed for improper venue, the article contains a lot of interesting information about self-driving cars and manufacturers’ liability for crashes.
Get the Help You Need from Lawyers You Can Trust
If you’ve been injured in a car crash because of another’s negligence, Callaway & Wolf can help. Your initial consultation with an attorney is free, and we work on a contingency fee basis, so you won’t pay us unless you obtain compensation.
Call us today at 415-541-0300 and schedule a consultation at our San Francisco or Oakland office to discuss your case, how California law may apply, and how we can help you recover the compensation you deserve.