Technology that can feel

On the path to intuitive mobility: at the Fraunhofer NeuroLab, scientists are developing emotionally sensitive brain-computer interfaces.

The truck ahead is moving very slowly. Better switch on the indicator. Smoothly, the overtaking assistant initiates a lane change. The live emotion identifier shows that the driver is relaxed: the speed is pleasant and the exit angle comfortable. The emotional situation is ideal, and the technology ensures that it stays that way until the end of the journey.

Until now, interacting with technical equipment has not always been so intuitive. More often, we as users have had to adapt to how machines operate. That, however, might soon change. New neurotechnology is opening a communication channel between the human brain and computer technology that works in both directions. Two researchers, Kathrin Pollmann and Mathias Vukelić, are paving the way in this area. They are human-computer interaction experts at the Fraunhofer Institute for Industrial Engineering (IAO). Their objective is to develop technology that fulfils our wishes even before we have articulated them.

The aim of research at NeuroLab: Technology that feels good
Kathrin Pollmann, User Experience Researcher at the Fraunhofer IAO

‘Neurotechnology can already detect a user’s mental and emotional state,’ Kathrin Pollmann explains. ‘But even the most intelligent systems are unable to react accordingly.’ Pollmann and Vukelić have just taken another major step towards genuine human-computer interaction: together with other research partners in the EMOIO project, they have developed a brain-computer interface that can detect emotions.

The project, financed by the German Federal Ministry of Education and Research, is currently focussing on driver assistance systems. This is a particularly exciting area for the researchers, as sensor-based technology, widely utilised in cars, is already taking over many tasks on behalf of humans. Soon it could focus on the individual needs of a user and adapt to them—technology with a feeling for feelings, so to speak.

Holding a directional microphone to the brain

Kathrin Pollmann and Mathias Vukelić work at the NeuroLab of the Fraunhofer IAO. It is a neuroscientific test lab which, at first glance, looks like an electronics workshop. On one side there are desks with computers; on the other, glass busts wearing caps dotted with sensors. And lots and lots of cables. The most important method in any trial is electroencephalography (EEG). Researchers use it to measure human brain waves, which vary depending on what a person is thinking about or imagining.

To measure the activity of the brain, the testers place a cap covered with electrodes on each research subject’s head. The subjects are then asked to complete simple tasks on the computer. The sensors transmit the brain signals to an amplifier and on to a computer, which turns them into wave-shaped lines. But how does one identify positive or negative emotions in the storm of electric impulses produced by a hundred billion nerve cells? Which signal means what?
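For readers who want to see the shape of this pipeline, here is a minimal Python sketch of how an amplified electrode signal can be turned into the familiar ‘wave-shaped lines’. The sampling rate, noise level and frequency bands are illustrative assumptions, not the NeuroLab’s actual recording parameters.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 250                      # assumed sampling rate in Hz
t = np.arange(0, 10, 1 / fs)  # ten seconds of recording

# Stand-in for the amplified electrode signal: a 10 Hz "relaxed" alpha
# rhythm buried in broadband noise.
rng = np.random.default_rng(0)
raw = np.sin(2 * np.pi * 10 * t) + rng.normal(scale=2.0, size=t.size)

def bandpass(signal, low, high, fs, order=4):
    """Isolate one frequency band, as the analysis software would."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, signal)

# The "wave-shaped lines": classic EEG bands extracted from the raw trace.
bands = {"theta (4-8 Hz)": (4, 8), "alpha (8-13 Hz)": (8, 13), "beta (13-30 Hz)": (13, 30)}
waves = {name: bandpass(raw, lo, hi, fs) for name, (lo, hi) in bands.items()}

for name, wave in waves.items():
    print(f"{name}: mean power {np.mean(wave**2):.3f}")
```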

Dr Mathias Vukelić, neuroscientist and behavioural scientist

A fan club for every emotion

‘Imagine you are trying to record football fans singing in a stadium. If you were to put a microphone on the playing field, you would not be able to distinguish much, only a blur of ten thousand voices. If you were to take the microphone closer to one of the fan blocks, you would better understand what they are singing—provided they are singing in unison,’ Mathias Vukelić explains. In a similar way, an EEG sensor measures the uniform activity of several thousand groups of nerve cells, which briefly form a sort of fan club around one emotion. This activity can then be detected as a change in electric potential on the scalp.
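The acoustics of the analogy can be made concrete in a few lines of code: thousands of out-of-phase signals largely cancel each other out, while synchronised ones add up to a measurable deflection. In this toy example, the cell count and frequency are arbitrary assumptions chosen only to make the effect visible.

```python
import numpy as np

rng = np.random.default_rng(1)
n_cells = 10_000
t = np.linspace(0, 1, 500)

# Whole stadium: every cell "sings" at its own random phase.
phases = rng.uniform(0, 2 * np.pi, size=(n_cells, 1))
incoherent = np.sin(2 * np.pi * 10 * t + phases).sum(axis=0)

# Fan block: the cells fire in unison (identical phase).
coherent = n_cells * np.sin(2 * np.pi * 10 * t)

print(f"incoherent peak: {np.abs(incoherent).max():9.0f}")  # grows like sqrt(n_cells)
print(f"coherent peak:   {np.abs(coherent).max():9.0f}")    # grows like n_cells
```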

The EEG has the advantage of being able to record this change in potential in a matter of milliseconds. In addition, functional near-infrared spectroscopy (fNIRS) measures the oxygen level in the blood and can thereby detect which area of the brain is active. With this combination of methods, emotions can be reliably deduced, interpreted and categorised.
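One plausible way to combine the two measurement streams is to concatenate fast EEG band power with the slower fNIRS oxygenation level into a single feature vector per time window. The article does not specify the project’s actual feature set, so the channel counts, window sizes and band choice below are assumptions.

```python
import numpy as np

def eeg_band_power(eeg_window, fs=250, band=(8, 13)):
    """Fast feature from the EEG: mean power per channel in one band."""
    freqs = np.fft.rfftfreq(eeg_window.shape[-1], d=1 / fs)
    power = np.abs(np.fft.rfft(eeg_window, axis=-1)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return power[..., mask].mean(axis=-1)

def fuse(eeg_window, fnirs_window):
    """Combine fast electrical and slow haemodynamic evidence into one vector."""
    return np.concatenate([eeg_band_power(eeg_window), fnirs_window.mean(axis=-1)])

# Hypothetical shapes: 8 EEG channels x 250 samples, 4 fNIRS channels x 10 samples.
rng = np.random.default_rng(2)
features = fuse(rng.normal(size=(8, 250)), rng.normal(size=(4, 10)))
print(features.shape)  # (12,) -> one fused feature vector per time window
```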

Frustration is one such emotion. To trigger it, the test software prevents the test subject from completing a task on the computer, which annoys the person. This is an interesting moment for the researchers. What does frustration look like in terms of brain activity? What kind of neurological pattern does a negative emotion produce? Ultimately, the goal is to develop an algorithm that can draw on a store of positively and negatively categorised patterns and produce appropriate technical reactions. In a vehicle, for instance, driver fatigue and tension might automatically cause the speed to be reduced or the distance to the vehicle ahead to be increased.
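In outline, such an algorithm is a binary classifier trained on the stored, labelled patterns. The sketch below uses synthetic data, a linear discriminant model and an invented reaction table purely for illustration; the project’s actual model and reaction logic are not described in the article.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Synthetic stand-in for the store of labelled patterns, which in reality
# would come from experiments like the frustration task described above.
rng = np.random.default_rng(3)
n_trials, n_features = 200, 12
X = rng.normal(size=(n_trials, n_features))
y = rng.integers(0, 2, size=n_trials)   # 0 = negative (e.g. frustration), 1 = positive
X[y == 0] += 0.8                        # shift negative trials so the classes separate

clf = LinearDiscriminantAnalysis().fit(X, y)

# Hypothetical mapping from detected state to a technical reaction.
REACTIONS = {0: "reduce speed and increase following distance",
             1: "keep current driving behaviour"}
state = clf.predict(X[:1])[0]
print(REACTIONS[state])
```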

‘Like’ or ‘dislike’

A first practical tool already exists. As part of the EMOIO project, the team developed a brain-computer interface. It functions much like the test units in the NeuroLab, except that it is faster. An EEG and fNIRS record the brain’s activity, and an algorithm analyses the reaction and compares it to previously recorded positive and negative signals, assigning it one of two possible categories: ‘like’ or ‘dislike’. All of this happens in real time. As a visual representation, a mobile app displays the user’s emotional state as it occurs, using an emoticon and a line diagram.
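The real-time loop can be pictured as follows. Everything here is a placeholder, with simulated data and a trivial stand-in classifier, meant only to show the shape of the pipeline from measurement window to ‘like’/‘dislike’ label.

```python
import numpy as np

rng = np.random.default_rng(4)

def next_window():
    """Stand-in for one second of EEG/fNIRS data from the headset
    (8 channels x 250 samples is an assumed, not actual, format)."""
    return rng.normal(size=(8, 250))

def classify(window):
    """Stand-in for the trained model; returns 'like' or 'dislike'."""
    return "like" if window.mean() >= 0 else "dislike"

for _ in range(5):  # in the app, this loop runs for as long as the session lasts
    label = classify(next_window())
    print(("🙂" if label == "like" else "🙁"), label)
```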

A major step towards achieving genuine human-computer interaction.

Dr Mathias Vukelić, neuroscientist and behavioural scientist

To the layman, this may seem quite simple, but Mathias Vukelić sees it as a major step towards achieving human-computer interaction in a real-life setting, and towards intuitive mobility. After all, the prototype, which currently only makes a general distinction, is just the beginning; with this knowledge, self-learning systems can be developed for mobility or work-related purposes and adapted to any user.

An assistance system can detect what is good for a driver, based solely on neurophysiological data

An assistance system in a car might then be able to detect what is good for the driver, based solely on neurophysiological data. If a feeling of rejection is detected, the system automatically makes a different decision, based on the neurological patterns it has previously learned. Provided that data security is ensured, this creates an individual and direct relationship between the human and the technology. Most importantly, the system only does what the driver has trained it to do.
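Reduced to its logical core, the behaviour might look like this. The concrete decision policy is not specified in the article, so the actions and feedback rule below are invented placeholders.

```python
import random

# Hypothetical decision rule: if the BCI reports 'dislike', fall back to an
# alternative behaviour the driver has not rejected.
ACTIONS = ["early, gentle lane change", "late, brisk lane change"]

def bci_feedback(action):
    """Stand-in for the neurophysiological like/dislike signal."""
    return "like" if "gentle" in action else "dislike"

action = random.choice(ACTIONS)
if bci_feedback(action) == "dislike":
    action = next(a for a in ACTIONS if a != action)
print("assistant executes:", action)
```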

Besides emotions, the Fraunhofer researchers are also interested in tracking the driver’s level of concentration. If the car is in autonomous driving mode, for instance, the driver could then intervene at any given moment, should the need arise. Constant monitoring of the driver’s mental strain, as well as a ‘waking’ or activating function based on the driver’s state, would be key safety features.

Wearable device measures brain activity

The advancements in human-robot collaboration and autonomous driving call for new solutions. ‘We have to be courageous,’ Pollmann states. ‘There are expensive or difficult tests that never get past the testing stage. That’s why we use scenarios with a practical application for our research. Under these circumstances, we can test what mobile methods such as EEG and fNIRS are capable of and how they can be used effectively.’ The equipment still seems somewhat cumbersome; after all, no one wants to sit in a car wearing an EEG cap. But material research will make great leaps in the future, according to Mathias Vukelić, who can imagine people wearing 3D-printed headgear.

The NeuroLab’s gallery of what might be described as scientific swim caps shows that the thickness and size of the sensors have already been reduced. The first models used huge ring electrodes, which required lots of gel to create contact between the sensors and the scalp. Now the team mainly uses caps with far smaller EEG sensors, which require less gel, or none at all. Elegant future solutions could take the form of headbands or fitness bracelets, which hardly bother anyone today.

Being able to decide freely is very important to people.

Kathrin Pollmann, User Experience Researcher at the Fraunhofer IAO

When do human-computer interaction systems feel good? Kathrin Pollmann would like to see human-computer systems that deliver real value to the user, as well as being user-friendly and fun. With regard to autonomous driving and intuitive mobility, making the processes involved transparent is a top priority. ‘The user needs to have the feeling that he or she is in control; that’s something we’ve found in all our projects. The driver should be able to decide whether the car does something or whether he or she does it instead. Overall, being able to decide freely is very important to people.’

Glossary

Algorithm
A sequence of instructions for solving a problem or task. Algorithms consist of a finite number of defined individual steps. These can be implemented for use in a computer program.

Electroencephalography (EEG)
Method for studying the electrical activity of the brain. Changes in electric potential on the scalp can be recorded within milliseconds, so that deductions can be made regarding perception and thought processes.

EMOIO
Project funded by the German Federal Ministry of Education and Research to develop emotionally sensitive assistance systems that can adapt to a user's feelings and individual needs.

Fraunhofer IAO
At the Fraunhofer IAO in Stuttgart, scientists research how we will work and live in the future and apply their findings within a results-oriented framework.

Functional near-infrared spectroscopy (fNIRS)
Non-invasive imaging method used to examine how the brain works. Infra-red light from light-emitting diodes is channelled through the scalp into the underlying tissue. Sensors measure the reflection, which varies with oxygen consumption across the brain. This allows researchers to record which areas of the brain are most active.

Brain-computer interface (BCI)
An interface that translates a person's brain activity into commands for a technical device.

Human-computer interaction (HCI)
Area of research dedicated to examining the design and use of computer technology at the interface between humans and computers. It draws on results and knowledge from the fields of computer science, psychology, work science, cognitive science, ergonomics, sociology and design.

NeuroLab
The Laboratory for Neuroscience (NeuroLab) is a testing area for topics related to neuroscience at the Fraunhofer IAO in Stuttgart. The NeuroLab researches how the brain functions when people use technical equipment.

At a glance

  • Fraunhofer NeuroLab
  • Kathrin Pollmann
  • Dr Mathias Vukelić

The Fraunhofer IAO in Stuttgart opened its NeuroLab in 2015 as a test environment for questions related to neuroscience. Researchers apply neuroscientific findings to enable human-friendly work models. The focus is on assistance systems used in vehicles, in human-robot collaboration and in scientific work.

Kathrin Pollmann, User Experience Researcher at the Fraunhofer IAO, studies emotions and psychological phenomena related to human-computer interaction in order to promote positive experiences with technical products and enable a human-friendly digitalisation process.

Dr Mathias Vukelić is a neuroscientist and behavioural scientist at the institute. His great curiosity and passion for the scientific process drive his research, which currently involves identifying the brain's activity patterns associated with the cognitive and emotional states that occur when humans use technology. Another field of his work is the development of innovative brain-computer interfaces.
