At the same time, someone, somewhere will have to write the code telling the car's computer "brain" what to do in an accident involving another person. Society as a whole would benefit from safer driverless cars, as long as people buy into the idea.
"When we asked them if they wanted to have such a car for themselves, they tell us its great if other people get these cars, but I prefer not to have one myself," said study co-author Jean-François Bonnefon, a psychological scientist at the Toulouse School of Economics (France) for the Centre National de la Recherche Scientifique.
The problem of driverless cars and whom to protect has already been felt in the aviation world. Technology that prevents a bad decision by a pilot (or a suicide mission), known as "controlled flight into terrain avoidance," already exists but has not been implemented by major commercial airline companies, according to Ella Atkins, professor of aerospace engineering at the University of Michigan and a consultant to several car companies on autonomous driving.
Atkins believes autonomous cars will be used well before autonomous planes. She believes the answer is to come up with a federal standard on crash avoidance that can't be overridden by individual car companies or car owners.
"Somebody somewhere has to decide what the weights are in the cost of braking for the person versus swerving and going over the cliff," Atkins said about auto engineer programming. "The optimization will come up with extremely high cost for both decisions, it will be the weights decided by a consortium, provided by government, not by a car company."
Both Atkins and the study authors note that improvements in vehicle safety systems such as automatic braking and sensors to detect and predict the path of pedestrians and other vehicles will make some of these scenarios less likely, or at least extremely rare.
Perhaps by then, car buyers will get used to the idea of someone else making life-or-death decisions. The authors have launched the Moral Machine website to discuss these technological and ethical issues in more depth.
A second expert believes that building trust between people and the machines that drive them will go a long way toward avoiding these kinds of no-win scenarios. That trust can be built by having the car slow down in crowded areas or in bad weather, for example.