Would you get into an automated self-driving vehicle, knowing that in the event of an accident, it might sacrifice your life if it meant saving the lives of 10 other people?
Autonomous vehicles (AVs), also known as self-driving vehicles, are already a reality. Initial guidelines on the technology from the National Highway Traffic Safety Administration are expected this summer, and road tests are under way across the country.
But one barrier to the widespread use of autonomous vehicles is deciding how to program these vehicles' safety rules in the most socially acceptable and ethical way.
After six months of surveys, an international team of researchers published its findings today in the journal Science, reporting that the public has a conflicted view of the future of self-driving technology.
Scientists used Amazon's Mechanical Turk platform, an online marketplace where workers are paid to complete "human intelligence tasks" that computers aren't yet able to do.
Over the course of six months, researchers conducted six online surveys, with 1,928 participants total, asking people to evaluate the morality of several different situations a self-driving vehicle may one day encounter.
One survey found that 76 percent of participants felt it would be more moral to sacrifice the life of one passenger than to kill 10 pedestrians. Participants upheld this utilitarian view, which maximizes the number of lives saved, even when asked to imagine that their own family members were in the car.
While most favored the outcome that saves the most lives, the survey results also indicated that participants would be less likely to buy a car programmed to follow that principle, preferring instead a car that would be more protective of themselves and their families. They also expressed reluctance to accept government regulation of self-driving vehicles.
The researchers call this situation, in which individuals acting in their own self-interest could make conditions less safe for everyone, a "social dilemma."
"You can recognize the feeling; the feeling that I want other people to do something but it would be great not to do it myself," Jean-Francois Bonnefon, co-author of the study, said during a teleconference with reporters.
The authors of the study note that the benefits of autonomous vehicles are numerous and include significantly reducing the number of traffic accidents that occur each year, reducing pollution, increasing traffic efficiency, and enabling the elderly and disabled to move around more easily.
"Autonomous cars have the potential to revolutionize transportation, eliminate the majority of deaths on the road," Iyad Rahwan, another author of the study, said during the teleconference. "That is over one million global deaths annually. But as we work on making the technology safer, we need to recognize the psychological and social challenges."
Curious to explore some of the situations an AV might one day face? The Scalable Cooperation group at the MIT Media Lab has created a game that lets users interactively explore the kinds of decisions these cars will be programmed to make.
Dr. Maryam Jahdi is a psychiatric resident at The Ohio State University Wexner Medical Center. She is a resident in the ABC News Medical Unit.