Why Self-Driving Cars Will Favor The Lives Of Passengers Over Pedestrians

When facing an imminent crash, a self-driving car could save the lives of many pedestrians in its path — or save the couple of passengers on board. How should it be programmed to act?

People can’t buy self-driving cars yet, but they already have strong opinions about how the vehicles should behave during a crash.

A study published Thursday surveying about 1,900 potential customers found that in a hypothetical crash scenario, most people believe a car should be programmed to save as many people as possible. But if it were their own car, survey takers came to a very different conclusion: Passengers’ lives should take priority over those of pedestrians.

Self-driving cars aren’t on the market yet: Prototypes like Google’s are still figuring out how to honk politely, and how to avoid run-ins with public transport. In the new surveys, the researchers presented potential customers with several versions of a popular psychology puzzle, set a few years in the future.

Here’s the hypothetical conundrum: An autonomous car notices a group of pedestrians enter its path, and has only two options: Barrel through the crowd causing multiple fatalities, or swerve away and crash, potentially killing the one or two passengers on board. Survey takers were asked: What should the car do?

If there are a lot of pedestrians in the group, survey takers said, then the answer is clear. Most respondents said they would like to live in a world where their car decides to save the highest number of lives — even if that means killing themselves or their family members riding inside the car.

“There was a strong moral consensus on what a car should do if killing its passengers meant saving more people on the road,” Jean-François Bonnefon, a researcher at the Toulouse School of Economics, and a co-author on the Science paper describing the results, said at a briefing for reporters.

But that altruism disappeared when respondents were asked about the kind of car they would buy for themselves: 81% said they would choose one that kept its passengers safe in any situation.

“They tell us that it’s great if other people get these cars, but I prefer not to have one myself,” Bonnefon said. “And I think you can see what they’re doing here. I mean, you can recognize the feeling: the feeling that I want other people to do something, but it would be great not to do it myself.”

Unless there were laws about which algorithms cars must use, these preferences could “push people, nearly everybody, towards getting the self-protective cars,” Azim Shariff, a professor of psychology at the University of Oregon and a co-author on the study, told BuzzFeed News.

Eventually, this could lead carmakers to advertise and manufacture cars that choose the safety of passengers over pedestrians. “Those are the ones that they’re going to see actual profit for,” Shariff said.

Google declined to comment on the new survey, and instead referred to comments made last year by Chris Urmson, who leads the company’s self-driving car project.

“It’s not possible to make a moral judgement of the worth of one individual person versus another — convict versus nun,” he said at a talk in Cambridge, Massachusetts. “When we think about the problem, we try to cast it in a frame that we can actually do something with.”

The researchers created a website where anyone can flip through scenarios presented in the survey, with the goal of getting people outside the technology sector engaged in ethical discussions about future technologies.

Lawyers and philosophers have been mulling the killer-car scenario in op-eds and essays for years, as part of a broader discussion about a future in which machines and algorithms are entrusted with choices traditionally made by people.

But the new survey of public opinion that includes non-experts is unprecedented, and therefore valuable, Ryan Calo, a professor of law at the University of Washington, told BuzzFeed News by email.

That said, Calo added, many technologists see this particular moral conundrum as unlikely. “Few people ever get into a situation like this … much less a machine that drives more conservatively, has sensors, and reacts faster than a person.”

Tech companies have acknowledged the value in consulting with people who will eventually use their products during the design process.

But basing ethical decisions on majority opinion hasn’t always worked well, Patrick Lin, a professor of philosophy at California Polytechnic State University who has written about the killer-car thought experiment, told BuzzFeed News by email.

“Think about all the times the ‘yuck factor’ has been on the wrong side of history, which is just about every time — blood transfusions, organ transplants, interracial relationships, same-sex marriage, and so on,” he said. “This survey seems to over-privilege our intuitions, which are often wrong.”


Nidhi Subbaraman is a Science Reporter for BuzzFeed News and is based in Washington, DC.
