Popular Science: A Motorcyclist is Suing GM After Crashing into its Self-driving Car

By Rob Verger, January 26, 2018 


It was morning in San Francisco on December 7 of last year, and a self-driving car and a motorcyclist were both motoring down Oak Street. The autonomous car, a white 2016 Chevy Bolt, started to make a lane change to the left from the center lane, then aborted it—the gap it was moving into shrank. When it slid back into the center lane of the three-lane street, it collided with a motorcycle that had been passing on the right. The motorcycle and its rider fell to the ground.

The accident triggered a lawsuit for over $75,000 against General Motors, which owns the automation company Cruise; the 27-year-old plaintiff, Oscar Willhelm Nilsson, went on disability leave because, according to the suit, he “suffered injuries to his neck and shoulder and will require lengthy treatment.”

Welcome to the fascinating world of accidents involving vehicles that drive themselves and make decisions on their own.

The legal system will now need to determine where any blame lies. In a report filed by GM to California’s DMV, the company states, referencing a traffic report, that “the motorcyclist was determined to be at fault for attempting to overtake and pass another vehicle on the right…” Traffic at the time was “heavy,” and the company says that the biker had been lane-splitting, which is legal in the state and involves a rider cruising between two lanes of traffic.

According to the police traffic collision report, a copy of which was obtained by Popular Science, the driver of the Cruise said that “he attempted to take control of the self-driving vehicle by grabbing the wheel, but simultaneously collided with [the motorcycle].” The weather was clear, the roadway, dry. The fire department took Nilsson to the hospital.

“Safety is our primary focus when it comes to developing and testing our self-driving technology,” a GM spokesperson said in an emailed statement. “In this matter, the SFPD collision report stated that the motorcyclist merged into our lane before it was safe to do so.”

Reached by phone, Sergei Lemberg, the lawyer representing Nilsson, said that “our position is that GM is 100% responsible for this accident” because their car hit Nilsson, and referred to the automated vehicle’s action as “unpredictable and dangerous.”

How should we judge autonomous cars when they crash?

Self-driving vehicles are generally known as conservative drivers, focused more on safety than on getting anywhere fast.

“Today, the vast majority of crashes are caused at least in part by human error, and as more and more driving decisions are shifted from human to machine, more and more crashes will hopefully be prevented,” says Bryant Walker Smith, an assistant professor at the University of South Carolina who studies the intersection of technology and the law and focuses on autonomous vehicles. “But those that aren’t will increasingly be explained, at least in part, by some kind of computer error.”

When speaking about drivers making errors, one commonly used word is “negligence,” says Smith, and indeed, the lawsuit alleges that the car acted in a “negligent manner.” But with products, the key word is “defect.”

And things get really interesting when thinking about what it means for a product, like an autonomous car, to have a defect. A new phenomenon, Smith says, is that instead of thinking about a defect with the car itself—like a faulty airbag—now a defect could potentially be found in “the decisions that the vehicle makes.”

Generally, there are two different paths for trying to figure out whether an automated car was defective in any way, Smith says. One method would be to compare the computer’s driving to a person’s, and ask whether a human could have handled the situation better. If so, the automated system could probably be said to be defective. Another method would be to compare the self-driving car in question to the technology that other self-driving vehicles on the road have at the time; in that case, as automated technology improves, the bar for safety gets higher. (Both methods will probably be used in cases involving self-driving cars, Smith says.)

Do circuits deserve empathy?

Stephen Zoepf, the executive director of the Center for Automotive Research at Stanford, agrees that one interesting way of exploring crashes involving self-driving vehicles is to look for the “counterfactual.” In other words: Ask what might have happened if a human had been driving the car instead of the computer, and vice versa.

But psychologically, thinking about accidents involving self-driving cars is different in a key way, says Zoepf. “We can usually empathize with a human driver that makes a mistake,” he says—it’s not hard to mentally put yourself in someone’s shoes who has screwed up in some way behind the wheel. (He touches on these same ideas in a recent commentary for Reuters.)

“When it comes to automated vehicles, we’ll be starting to face accidents that we don’t understand,” he adds. “Because fundamentally, humans and computers make different decisions, and they have different strengths and weaknesses.” We might technically understand what a self-driving car did (and in that respect, the company that made it or operates it has access to detailed data on the car’s operation), but empathizing with the decision is another story.

In other words, as more and more cars become automated, the accident rate will probably decline, but thinking about the accidents that do happen may take a different kind of mental effort.

Original story: At Popular Science
