The Trolley Problem, once an arcane puzzle in philosophy and ethics, has taken center stage in discussions about how autonomous vehicles should behave in the event of an accident. As reported previously in this blog, The Moral Machine, a website created by MIT researchers, gave everyone the opportunity to confront the dilemmas an autonomous car may face when deciding what action to take in an unavoidable accident.
The site became so popular that it gathered more than 40 million decisions from people in 233 countries and territories. An analysis of this massive dataset was just published in an article in the journal Nature. On the site, you face a simple choice: drive forward, possibly killing some pedestrians or vehicle occupants, or swerve left, killing a different group of people. From the choices made by millions of people, it is possible to derive some general rules about how people believe one should act when faced with the difficult choice of whom to kill and whom to spare.
The results reveal some clear preferences, but also show that certain decisions vary strongly with the respondent's culture. In general, people choose to protect babies, children and pregnant women, as well as doctors (!). At the bottom of the preference scale are the elderly, animals and criminals.
Images: from the original article in Nature.