Automotive News
Should Your Car Kill You To Save Others? The Self-Driving Dilemma

Three pedestrians have just stumbled blindly into the crosswalk ahead. With no time to slow down, your autonomous car will either hit them or swerve off the road, probably crashing and endangering your life. Who should be saved? In a split second, the car has to make a choice with consequences both moral and mortal.

A three-person team of psychologists and computer scientists, led by Jean-François Bonnefon at the University of Toulouse Capitole in France, just completed an extensive study on this ethical quandary. They ran half a dozen online surveys posing various forms of this question to U.S. residents, and found an ever-present dilemma in people’s responses. “Most people want to live in a world in which everybody owns driverless cars that minimize casualties,” says Iyad Rahwan, a computer scientist with the team at MIT, “but they want their own car to protect them at all costs.”


This isn’t just a trivial riddle or a new take on the trolley problem thought exercise. Now that computers are driving large metal machines that can kill, they’ll have to be programmed to make these kinds of decisions. “It’s a rather contrived and abstract scenario, but we realize that those are the sorts of decisions that autonomous vehicles are going to have to be programmed to make,” says Azim Shariff, a psychologist with the team at the University of Oregon.


“It’s also a big challenge to the widescale adoption of autonomous vehicles, especially when there’s already a basic fear about entrusting a computer program to zip us around at 60 miles an hour or more,” he says. “So we conducted a series of online experiments to gauge how people were thinking about these ethical scenarios and how comfortable they would be to buy autonomous vehicles that were programmed in various ways.” The survey results are outlined today in the journal Science.

The Self-Driving Dilemma

The scientists used the Amazon Mechanical Turk platform to conduct their surveys between June and November 2015, paying respondents 25 cents per survey. Only U.S. residents were polled.

Whether the choice was to fatally crash the car to save two, three, or ten pedestrians, “what we found was that the large majority of people strongly feel that the car should sacrifice its passenger for the greater good,” says Bonnefon. “Even when people imagined themselves in the car, they still said that the car should sacrifice them for the greater good. And even when people imagined being in a car with a family member or even with their own child, they still said the car should kill them for the greater good.”

The numbers here were stark. In one survey, where the choice was between saving the car’s passenger or saving a crowd of ten pedestrians, more than 75 percent of respondents agreed that sacrificing the autonomous vehicle’s passenger was the more moral choice. In short, “most people agree that from a moral standpoint, cars should save the [maximum number of people] even if they must kill their passengers to do so,” Bonnefon says.

“They want their own car to protect them at all costs.”

There is a big “but” coming. When given the option of hypothetically buying a self-driving car that’s utilitarian (programmed to save the greatest number of people) or one that’s selfish (programmed to save its passenger at all costs), people are quick to buy the selfish option. When it comes to utilitarian cars, “they tell us that it’s great if other people get these cars, but I prefer not to have one myself,” says Bonnefon.

Economists call this feeling a social dilemma. It’s a bit like how most people view paying taxes. Yeah, everyone should do it. But nobody is too keen on doing it themselves.

What if Selfish Is Better?

When considering these thorny questions about whom self-driving cars should and should not kill, it’s easy to lose sight of the bigger picture: autonomous vehicles have the potential to drastically reduce the number of car accidents and traffic-related deaths by eliminating human error, be it drunk drivers, distracted drivers, or good drivers who just make a mistake. “Just in the United States last year, there were nearly 40,000 traffic fatalities and about 4.5 million people with serious injuries,” says Shariff at the University of Oregon. “Depending on how you calculate it, the dollar cost of those accidents approaches $1 trillion a year.”

“These cars have the potential to revolutionize transportation, eliminating the majority of deaths on the road.”

Just because the numbers say that self-driving cars will be safer, though, doesn’t mean people are ready to trust computers to take the wheel. Here, Bonnefon and his colleagues suggest their findings could be useful to policymakers hoping to ensure the safest possible implementation of self-driving cars while still encouraging people to accept them. “These cars have the potential to revolutionize transportation, eliminating the majority of deaths on the road (that’s over a million global deaths annually), but as we work on making the technology safer, we need to recognize the psychological and social challenges they pose too,” says Rahwan at MIT.

Oddly enough, “the best strategy for utilitarian policy-makers may, ironically, be to give up on utilitarian cars,” writes Joshua Greene, a psychologist at Harvard University (who wasn’t involved in the study), in an essay accompanying the paper. “Autonomous vehicles are expected to greatly reduce road fatalities. If that proves true, and if utilitarian cars are unpopular, then pushing for utilitarian cars may backfire by delaying the adoption of generally safer autonomous vehicles.”

Curious how you might approach these ethical self-driving car scenarios yourself? The scientists published an interactive website today where you can explore them.
