By Peter Holley, January 25, 2018
A California motorcyclist has filed a lawsuit against General Motors, accusing one of the manufacturer’s robot-operated vehicles of “negligent driving.”
Oscar Willhelm Nilsson claims he was traveling down a San Francisco street last month when a Cruise AV aborted a lane change and swerved into his lane. The car struck him, “knocking him to the ground,” in a crash that left him injured and unable to work, according to the suit filed in U.S. District Court in San Francisco this week.
The Cruise AV was operating in self-driving mode at the time of the crash, the suit alleges, but a backup driver was sitting in the front seat of the vehicle with his hands off the wheel.
A San Francisco Police Department report, however, puts the blame on Nilsson. According to the report, Nilsson tried to pass a vehicle before it was safe.
Nilsson’s suit — seeking damages exceeding $75,000 — is one of the first involving an autonomous vehicle. Though manufacturers say self-driving vehicles will dramatically reduce traffic accidents and motor vehicle fatalities, experts are already warning that there will probably be many more accidents involving robot-operated cars, crashes that raise unresolved questions about responsibility and restitution.
As of last week, the California Department of Motor Vehicles alone lists 54 autonomous vehicle accident reports dating to 2014, most of them minor accidents in which other drivers were at fault. Two Teslas were involved in separate crashes in California last week. Both drivers claimed that the vehicles were on autopilot, a feature that gives the vehicles semi-autonomy but still requires the driver to be responsible for the car.
A Model S driver who ignored numerous warnings inside his Tesla was killed in an accident in 2016. A National Transportation Safety Board report said the driver was overly reliant on the vehicle’s autopilot before his fatal crash.
But in future wrecks involving automated vehicles, experts say, manufacturers could see their liability increase.
“When crashes occur it’s much less likely that there’ll be a human to blame in a lot of instances and it’ll be much more likely that it can be argued that the automated system can do better,” said Bryant Walker Smith, a law professor at the University of South Carolina and an expert on the law of self-driving cars.
Smith said investigators will likely determine fault by asking how humans would have performed in a similar circumstance and how comparable automated systems performed around the same time.
“Data will be crucial to these cases,” Smith added. “Investigations will increasingly turn on digital data stored locally or remotely — from the vehicles involved, other vehicles, personal devices, and surrounding infrastructure. Sometimes these data will provide certainty [allowing investigators to “replay” a crash] and sometimes they will actually introduce new uncertainty.”
In a report filed with the California DMV, GM disputed Nilsson’s account of the crash.
“As the Cruise AV was re-centering itself in the lane, a motorcycle that had just lane-split between two vehicles in the center and right lanes moved into the center lane, glanced the side of the Cruise AV, wobbled and fell over,” the report claims, noting that the Cruise AV was traveling with the flow of traffic at 12 mph and the motorcycle was traveling at approximately 17 mph.
The report states that the crash resulted in a “long scuff on passenger side of the vehicle.”
“The motorcyclist was determined to be at fault for attempting to overtake and pass another vehicle on the right under conditions that did not permit that movement in safety,” the report says, adding that Nilsson claimed he had shoulder pain.
Reached by email, a GM spokesperson said safety is the company’s “primary focus” during the development and testing of self-driving technology.
“In this matter,” the spokesperson added, “the SFPD collision report stated that the motorcyclist merged into our lane before it was safe to do so.”
The suit arrives on the heels of GM’s announcement that the company submitted a petition to the Department of Transportation this month asking for permission to deploy self-driving Cruise AVs. Fourth-generation autonomous vehicles based on the Chevy Bolt EV, they lack a steering wheel, pedals and human backup drivers.
The manufacturer is touting the vehicle as the world’s “first production-ready vehicle” built with the sole purpose of operating “safely on its own with no driver.”
GM — already testing driverless vehicles in San Francisco and Phoenix — is one of several companies testing level 4 vehicles, those that can drive themselves without human intervention under certain conditions. A California-based autonomous vehicle start-up called Zoox and Alphabet’s Waymo have also tested level 4 cars.
Nilsson’s suit seeks unspecified damages, as well as attorney’s fees and punitive damages. Reached by email, Nilsson’s lawyer, Sergei Lemberg, said the police report actually supports his client’s claim, not GM’s.
He said the police report states that the AV driver saw Nilsson before the crash but didn’t have enough time to grab the wheel and swerve.
“Absolutely we dispute it,” he said, referring to GM’s side of the story. “As far as the technology is concerned, I’m troubled that GM shifted the blame to my client. The maneuver by the autonomous car was unpredictable and dangerous.”
Original story: The Washington Post