Kill the baby or the grandma?

The Trolley Problem, once an arcane puzzle in philosophy and ethics, has taken center stage in discussions about how autonomous vehicles should behave in the event of an accident. As reported previously in this blog, a website created by MIT researchers, The Moral Machine, gave everyone the opportunity to confront the dilemmas an autonomous car may face when deciding what action to take in an unavoidable accident.

The site became so popular that it gathered more than 40 million decisions, from people in 233 countries and territories. The analysis of this massive amount of data has just been published in an article in the journal Nature. On the site, you are faced with a simple choice: drive forward, possibly killing some pedestrians or vehicle occupants, or swerve left, killing a different group of people. From the choices made by millions of people, it is possible to derive some general rules about how ethics commands people to act when faced with the difficult choice of whom to kill and whom to spare.

The results show some clear choices, but also that some decisions vary strongly with the culture of the respondent. In general, people choose to protect babies, children, and pregnant women, as well as doctors (!). At the bottom of the preference scale are the elderly, animals, and criminals.
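The Nature paper estimates these preferences with a form of conjoint analysis, which is beyond the scope of this post, but the basic intuition is simple: count how often each character type is spared relative to how often it appears in a dilemma. The minimal sketch below illustrates that intuition with a handful of made-up decisions; the character names and counts are invented for the example and are not from the Moral Machine dataset.

```python
# Minimal sketch (made-up data, not the Moral Machine dataset): rank character
# types by how often respondents chose to spare them in pairwise dilemmas.
# The actual Nature analysis uses conjoint analysis, which also controls for
# which characters appear together; this is only the rough intuition.
from collections import Counter

# Each record: (spared character, sacrificed character), one per decision.
decisions = [
    ("baby", "elderly"), ("pregnant woman", "criminal"),
    ("doctor", "animal"), ("baby", "criminal"),
    ("child", "elderly"), ("elderly", "animal"),
]

spared, faced = Counter(), Counter()
for winner, loser in decisions:
    spared[winner] += 1      # times this character was saved
    faced[winner] += 1       # times this character appeared in a dilemma
    faced[loser] += 1

# Spare rate: fraction of appearances in which the character was saved.
ranking = sorted(faced, key=lambda c: spared[c] / faced[c], reverse=True)
for c in ranking:
    print(f"{c}: spared {spared[c]}/{faced[c]} times")
```

Even on this toy input, the ranking mirrors the published pattern: babies and pregnant women at the top, criminals and animals at the bottom.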

Images: from the original article in Nature.

One thought on “Kill the baby or the grandma?”

  1. Now if they would only extend that research. How would individual people respond depending on the obvious details of the potential victims? I’m thinking of color of skin, style of clothing, religious paraphernalia, economic status, aesthetic appeal, etc.

    These are factors that shape people’s decision-making on a daily basis, even if most often unconsciously. We know, for example, that police are more likely to perceive a black guy as carrying a gun even when he is not and more likely to perceive a white guy as not carrying a gun even when he is.

    The study you mention shows that people would prioritize hitting a criminal. That is a dangerous heuristic, considering people are more likely to perceive minorities as criminals. If we program computers with data obtained from our society, we will inevitably program prejudices into the system (see the sketch after this comment). That would make prejudices even more systemic than they already are.

    https://benjamindavidsteele.wordpress.com/2018/08/28/we-cant-pretend-they-dont-exist-anymore/

    “We’re taking all of this data, a lot of it bad data, a lot of historical data full of prejudice, full of all of our worst impulses of history, and we’re building that into huge data sets and then we’re automating it. And we’re munging it together into things like credit reports, into insurance premiums, into things like predictive policing systems, into sentencing guidelines. This is the way we’re actually constructing the world today out of this data.

    “And I don’t know what’s worse, that we built a system that seems to be entirely optimized for the absolute worst aspects of human behavior, or that we seem to have done it by accident, without even realizing that we were doing it, because we didn’t really understand the systems that we were building, and we didn’t really understand how to do anything differently with it.”
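As a toy illustration of the commenter’s point, and not something from the study: the sketch below, with entirely made-up numbers, shows how a model fitted to biased historical labels reproduces the bias even when the underlying behavior is identical across groups.

```python
# Toy illustration (invented numbers, not from the Moral Machine study):
# a model fitted to biased labels reproduces the bias.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Synthetic population: a group attribute and a true behavior,
# generated independently of each other.
group = rng.integers(0, 2, n)            # 0 = majority, 1 = minority
behavior = rng.random(n) < 0.1           # true rate identical across groups

# Biased historical labels: the minority group is flagged more often
# for the same behavior (mimicking prejudiced policing records).
flag_prob = np.where(behavior, 0.9, 0.05) + 0.15 * group
label = rng.random(n) < np.clip(flag_prob, 0, 1)

# "Model": predicted risk per group is the observed flag rate, which is
# the estimate any learner fitted to these labels would converge to.
for g in (0, 1):
    print(f"group {g}: true rate {behavior[group == g].mean():.3f}, "
          f"predicted risk {label[group == g].mean():.3f}")
```

Both groups misbehave at the same true rate (about 0.10), yet the fitted risk for the minority group comes out roughly twice as high, purely because the training labels were biased.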
