

Should Your Car Be Programmed To Kill You?


Jonathan O'Callaghan

Senior Staff Writer

Jun 23 2016, 19:00 UTC
Image credit: chombosan/Shutterstock

Like it or not (although we can’t imagine why you wouldn’t), driverless cars are on their way, with some reports suggesting 10 million could be on the road by 2020. But with their adoption come a few problems to be ironed out, and perhaps none is more controversial than the so-called “greater good” scenario.

The issue is this: In a few very rare scenarios, a driverless car may have to choose between protecting its occupants and protecting pedestrians. For example, if it is traveling at speed and someone runs out into the road, should it swerve into other traffic to avoid them, potentially injuring or killing its occupants? Or should it make every attempt to stop, even though it knows it won’t be able to in time, killing the pedestrian?


This social dilemma is discussed in a study from a team of researchers published in the journal Science. “Autonomous vehicles (AVs) should reduce traffic accidents, but they will sometimes have to choose between two evils, such as running over pedestrians or sacrificing themselves and their passenger to save the pedestrians,” they note in their abstract.

(For more on these “trolley problems,” the researchers have released an interactive website to accompany the study, called "The Moral Machine.")

In the study, the scientists sought to find out what the public’s view was on situations like these. Using Amazon’s Mechanical Turk crowdsourcing platform, the researchers conducted six surveys between June and November 2015.

Their results showed that, for the most part, people thought the car should attempt to save as many people as possible: 76 percent thought a car should sacrifice one passenger to save 10 pedestrians. But support fell by a third when respondents were asked whether they would be willing to be that passenger themselves, and people were less likely to approve of a utilitarian approach when, for example, family members were in the car.


"Most people want to live in a world where cars will minimize casualties," said co-author Iyad Rahwan from the Massachusetts Institute of Technology in a statement. "But everybody wants their own car to protect them at all costs."

This in itself raises a huge number of complications. Whether cars can be programmed to have morality is a legal minefield, and how exactly they should behave in such scenarios is not clear. During a press conference with the authors, we asked where things stood at the moment, considering that driverless cars are already on the roads in some countries.

“At this point, we are mostly focusing, in terms of the technology, on just making those cars safer,” Rahwan said. “My understanding is that car manufacturers are working on improving safety across the board, and they are not yet at a point where they are dealing with these types of scenarios. But I think very shortly they will have to.”

We also asked three leading driverless car manufacturers – Tesla Motors, Google, and Faraday Future – for their thoughts on the matter, but none of the three responded to our request for comment.


Programming driverless cars to have different levels of morality would no doubt cause public outrage, so in the press conference the authors suggested that perhaps only self-protective versions should be made available, without complicating matters with “greater good” scenarios.

Above, a still from "The Moral Machine," the interactive website that accompanies the study

Nonetheless, there is little doubt this will become an issue in the future, especially when a driverless car does accidentally kill someone – passenger or otherwise – which is surely on the horizon. Consider that when one of Google’s self-driving cars bumped into a bus, it made the news around the world. “But in regular accidents, thousands of people died that day, but that was not covered,” the authors said in the press conference.

Driverless cars will bring improved safety for us all, dramatically reducing human-caused accidents. But if you thought the gun control debate (or lack thereof) in the US was ridiculous, you ain’t seen nothin’ yet when the first driverless car casualty arises.


(Note: We covered the pre-print of this study in our October 2015 article: “Should A Self-Driving Car Kill Its Passengers In A ‘Greater Good’ Scenario?”)

