The recent fatal accident with a Tesla in autopilot mode did not involve any difficult moral decisions by the automatic driving systems, as it resulted from insufficient awareness of the conditions, both by the Tesla Autopilot and by the (late) driver.
However, it brings to the fore other cases where more difficult moral decisions may need to be made by intelligent systems in charge of driving autonomous vehicles. The famous trolley problem has been the subject of many analyses, articles and discussions and it remains a challenging topic in philosophy and ethics.
In this problem, which has many variants, you must decide whether to pull a lever and divert an oncoming runaway trolley, saving five people but killing one innocent bystander. In one variant, there is no lever to pull; instead, you can push a fat man off a bridge, stopping the trolley but killing the man. People's responses vary wildly with the specific variant under analysis.
These complex moral dilemmas have been addressed in detail many times, and a good overview is presented in the book by Thomas Cathcart.
In order to obtain more data about difficult moral decisions, a group of researchers at MIT has created a website where you can try to make the tough choices yourself.
Instead of the more contrived scenarios of the trolley problem, you have to decide whether, as a driver, you should swerve or not, deciding in the process the fate of a number of passengers and bystanders.
Why don’t you try it?