The code behind the decision

A study published by scientists at the University of Osnabrück shows that machines can learn to make human-like moral decisions.

Since the invention of the automobile, progress has been determined almost exclusively by technical and physical parameters; ethical and moral standards played only a minor role, if any at all. With the development of autonomous driving, this is now changing. Engineers are mainly refining existing sensors and assistance systems and collecting data so that future vehicles can one day move through traffic completely accident-free, as if guided by an invisible hand. At the same time, fundamental moral and ethical questions are arising that will have to be answered before this new technology can be rolled out on a large scale.

In critical situations, when someone could potentially be injured, a human driver calls on his or her moral and ethical consciousness to avoid more serious consequences. The automobile, a soulless machine, is not familiar with these principles. However, a study at the Institute of Cognitive Science at the University of Osnabrück has shown that human morals can be modelled in such a way that machines can indeed make moral and ethical decisions.

For this study, 105 people wearing VR headsets had to react to simulated traffic situations in a foggy suburban setting. In doing so, they became subjects in so-called dilemma situations involving objects such as bales of straw, but also animals or humans, and had to decide what or who should be protected from a potential collision. During their virtual journey, the participants travelled at a speed of 22 mph at first, and then at 55 mph. ‘At a certain distance, we simulated a wall of fog to evaluate the distance at which the participants recognised the obstacles,’ explained Leon Sütfeld, the author of the study. ‘The reactions were so similar that we could make a relatively precise prediction regarding the participants’ behaviour.’ The participants’ average age was 31.


‘Moral decisions are not based on logic, but on human intuition and a complex assessment of a number of very different aspects,’ Sütfeld explained. ‘We concentrated on the dilemma situations to observe how humans behave. This way, we were able to assess how precisely we can predict or model their reactions before giving cars the chance to make similar decisions.’ The test showed a high degree of similarity among the participants’ moral decisions, so that ‘we are able to fairly accurately predict many people’s decisions using a single model.’
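The ‘single model’ idea can be illustrated with a short sketch. The Python snippet below is a hypothetical illustration, not the study’s actual model: it assumes each obstacle category carries one learned ‘value’, and the probability of sparing one obstacle over another is a logistic function of the difference between their values. All category names and numbers are made up for illustration.

```python
import math

# Hypothetical sketch of a single choice model for dilemma situations.
# Each obstacle category has one learned value; the probability of
# sparing obstacle A (and hitting B) grows with the value difference.
# Categories and numbers are illustrative, not results from the study.
values = {"straw bale": -2.0, "dog": 0.5, "adult": 2.0, "child": 2.8}

def p_spare(a: str, b: str) -> float:
    """Probability that the driver swerves to spare A and hits B."""
    return 1.0 / (1.0 + math.exp(-(values[a] - values[b])))

print(f"spare child over straw bale: {p_spare('child', 'straw bale'):.2f}")
print(f"spare dog over adult:        {p_spare('dog', 'adult'):.2f}")
```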

In the second step, the test results were transferred to a computer, with ten percent of the results withheld. ‘First we taught the computer with 90 percent of the data as part of the cross-validation. Then we used the remaining ten percent to measure the accuracy of our prediction.’ This procedure was repeated ten times, each time with a different ten percent held out, although ‘the result is pretty much the same each time,’ Sütfeld said. If the computer knows the actual result and realises that it has made a mistake, it corrects its parameters accordingly and gradually ‘learns’ to make the correct decision.
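The procedure Sütfeld describes is a standard ten-fold cross-validation. The sketch below shows the general pattern, assuming, purely for illustration, a logistic-regression classifier and randomly generated placeholder data; the study’s actual features, labels, and model may differ.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Placeholder data: one row per dilemma trial, one label per decision.
# Real features would encode the two obstacles shown in each trial.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 4))           # hypothetical features
y = (X[:, 0] - X[:, 1] > 0).astype(int)  # hypothetical choices

# Ten folds: fit on 90 percent of the data, test on the held-out
# ten percent, rotating the held-out tenth on each of the ten runs.
model = LogisticRegression()
scores = cross_val_score(model, X, y, cv=10)
print(f"accuracy per fold: {np.round(scores, 2)}")
print(f"mean accuracy:     {scores.mean():.2f}")
```

During training itself, the gap between the model’s prediction and the known outcome on each trial is what drives the parameter corrections the article describes.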


In June, the ethics commission on automated driving set up by Federal Minister of Transport Alexander Dobrindt presented its final report. It states: ‘Technology has to be designed in such a way that critical situations do not arise in the first place. These include dilemma situations, in other words situations in which an automated vehicle has to carry out one of two evils that cannot be properly weighed against each other.’ Furthermore, the experts stated: ‘Technical systems…must be designed to avoid accidents. However, they cannot be standardised to make complex or intuitive assessments of the consequences of an accident in such a way that they could replace or pre-empt the decision of a morally conscious driver.’
