Thought control

Harnessing the power of the mind to make real change for those with disabilities.

In the not-so-distant future, could it be possible for people with disabilities to operate an autonomous car through their thoughts? A brain-computer interface may be one answer.

5 August, 2021


The ‘brain-computer interface’ enables people with cognitive and motor impairments to communicate with their environment via mental commands

A dark room. A screen. And Cornelia Engel, whose head is adorned with a futuristic headset. Her eyes scan the display as a dot speeds across the screen. Suddenly, music starts to play out of nowhere. What’s going on? Is this a trick? No! Cornelia started the music – only with the power of her thoughts. How did she do it? To answer this question, we need to go back in time.

Everything started with Cornelia’s cousin Markus Burkhart, who was diagnosed with multiple sclerosis several years ago and is now paralysed from the neck down. Despite this, he runs his own car repair shop and, with the help of eye tracking and special computer programs, even carries out the administrative work himself. It may be tedious and time-consuming, but it works.

“I find it fascinating to observe him,” Cornelia explains. “Markus would give anything to be able to drive a car again and to be independent. I wondered if it would be possible to operate a car with an integrated eye-tracking system in combination with a brain-computer interface for him.” And so, she found a topic for her thesis at Audi in the Design Interior Interface area.

During her studies at the Mediadesign University of Applied Sciences in Munich, Cornelia Engel got to know the brain-computer interface (BCI) made by EMOTIV.

The brain-computer interface enables people with cognitive and motor impairments to communicate with their environment via mental commands. A mobile EEG headset registers the electrical activity of nerve cells; a computer then translates these signals into commands and passes them on to a device – for example a computer program, a wheelchair, a light switch or even a car.
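The pipeline described here – raw EEG activity classified into a trained mental command, which is then forwarded to a device – can be sketched roughly as follows. This is an illustrative sketch only, not EMOTIV’s actual API; the classifier and action names are invented for the example.

```python
# Hypothetical sketch of a BCI command pipeline: EEG samples are
# classified into a mental-command label, and each recognised command
# is passed on to a device action. All names are illustrative.

from typing import Callable, Dict, List

def classify(eeg_window: List[float]) -> str:
    """Stand-in for the trained classifier: maps a window of EEG
    samples to a mental-command label (or 'neutral')."""
    # A real system runs a model calibrated per user; here we just
    # threshold the average signal power for illustration.
    power = sum(x * x for x in eeg_window) / len(eeg_window)
    return "tap" if power > 0.5 else "neutral"

def dispatch(command: str, actions: Dict[str, Callable[[], None]]) -> bool:
    """Forward a recognised command to the mapped device action."""
    action = actions.get(command)
    if action is None:
        return False            # neutral / unknown: do nothing
    action()
    return True

# Example: the 'tap' command starts the music, as in the demo.
events: List[str] = []
actions = {"tap": lambda: events.append("music started")}

dispatch(classify([0.9, 0.8, 1.0]), actions)   # strong, stable signal
dispatch(classify([0.1, 0.0, 0.1]), actions)   # resting / neutral
# events == ["music started"]
```

The key idea is the separation of concerns: the classifier only produces labels, and a simple dispatch table decides what each label does – so the same trained command can start music today and drive a wheelchair tomorrow.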

With this approach of reading brain activity to translate thoughts into actions, Cornelia has not only improved usability but also made autonomous driving accessible to more people. People with physical impairments, like her cousin, stand to benefit.

In the autonomous concept car the Audi AI:CON, presented at the IAA 2017, and its successor, the AI:ME, the passenger can operate the graphical interface with an eye-tracking system in addition to touch and voice control. Several infrared sensors detect which area of the display the passenger is looking at, and the function shown there is then enlarged. To activate it, the passenger taps the touch-sensitive wooden screen. Markus, however, can no longer tap. The brain-computer interface (BCI) offers a solution.

That was the theory. Cornelia reveals how difficult it was in reality: 

“It took me around two weeks to learn a single command – in my case, tapping. It usually takes that long because the BCI must first be calibrated for the user, which in turn requires a great deal of practice and maximum concentration. A concise, stable mental image is needed for a clear signal to emerge.” Cornelia compares this procedure to a baby learning to grab things or to speak.

As a first step, she began to meditate so that her brain activity became calm and balanced. Only then, in a second step, could she assign specific spikes or impulses in that activity to a command.

“I imagined myself singing ‘forward’ – that’s my impulse for the ‘tap’ command,” she explained. “It works differently for everyone. My friend, for example, thought of the colour green.”

In order to use the eye tracker and BCI in a car’s interior, both systems must be integrated into the Audi graphical user interface (GUI). Cornelia put this into practice in her bachelor’s thesis, developing seven commands and working them out graphically and conceptually: left, right, up, down, clockwise rotation, counter-clockwise rotation and tapping.
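As a sketch of how such a command set might drive a graphical interface, the seven commands could be modelled as an enum and dispatched to interface actions. Everything below is hypothetical – the article does not describe Audi’s implementation – but it illustrates the idea: four directional commands move a selection, the two rotations turn a dial, and a tap activates the current selection.

```python
# Illustrative only: the seven mental commands from the thesis,
# mapped onto a toy interface state. Names are invented.
from enum import Enum, auto

class MentalCommand(Enum):
    LEFT = auto()
    RIGHT = auto()
    UP = auto()
    DOWN = auto()
    ROTATE_CW = auto()
    ROTATE_CCW = auto()
    TAP = auto()

class GuiState:
    """Toy interface state: a selection cursor, a dial and a tap flag."""
    def __init__(self) -> None:
        self.x, self.y = 0, 0      # which tile is selected
        self.dial = 0              # e.g. volume, turned by rotations
        self.activated = False     # set when the selection is tapped

    def apply(self, cmd: MentalCommand) -> None:
        if cmd is MentalCommand.LEFT:
            self.x -= 1
        elif cmd is MentalCommand.RIGHT:
            self.x += 1
        elif cmd is MentalCommand.UP:
            self.y += 1
        elif cmd is MentalCommand.DOWN:
            self.y -= 1
        elif cmd is MentalCommand.ROTATE_CW:
            self.dial += 1
        elif cmd is MentalCommand.ROTATE_CCW:
            self.dial -= 1
        elif cmd is MentalCommand.TAP:
            self.activated = True

state = GuiState()
for cmd in (MentalCommand.RIGHT, MentalCommand.RIGHT,
            MentalCommand.UP, MentalCommand.ROTATE_CW,
            MentalCommand.TAP):
    state.apply(cmd)
# state is now at tile (2, 1), dial turned up once, selection activated
```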

As a first step, she began to meditate so that her brain activity became calm and balanced

But the way has been paved. And that would mean a better quality of life for Markus and many others with disabilities

For all this to work, a different area of the brain must be activated for each command, and controlling several commands in direct succession requires maximum concentration. “But once it becomes second nature, it’s like riding a bike,” says Cornelia.

Once learned, the brain-computer interface offers a high level of operating reliability. If the system is additionally linked to an eye tracker, the user no longer has to wait for the desired control to light up – it can be targeted immediately, which speeds up operation. Just the right user experience for people with motor disabilities.
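One way to picture this combination: the eye tracker continuously reports which control the passenger is looking at, and a single BCI ‘tap’ activates that control directly. The sketch below is hypothetical – the names and structure are invented, not Audi’s real interface – but it captures why gaze plus one mental command is faster than cycling through controls.

```python
# Hedged sketch: gaze selects the target, one BCI command confirms it.
from typing import Dict, List, Optional

# Controls on the in-car display and the action each triggers
# (illustrative names only):
CONTROLS: Dict[str, str] = {
    "music": "toggle playback",
    "climate": "open climate menu",
}

class GazePlusBci:
    """Combines gaze (what is looked at) with one BCI command (tap)."""
    def __init__(self) -> None:
        self.gazed: Optional[str] = None
        self.log: List[str] = []

    def on_gaze(self, control: Optional[str]) -> None:
        # Eye-tracker callback: remember the control under the gaze.
        self.gazed = control if control in CONTROLS else None

    def on_bci(self, command: str) -> None:
        # BCI callback: 'tap' activates whatever is currently gazed at.
        if command == "tap" and self.gazed is not None:
            self.log.append(CONTROLS[self.gazed])

ui = GazePlusBci()
ui.on_gaze("music")   # passenger looks at the music tile
ui.on_bci("tap")      # thinks the trained 'tap' command
ui.on_gaze(None)      # looks away
ui.on_bci("tap")      # a tap with no gaze target does nothing
# ui.log == ["toggle playback"]
```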

Little did Cornelia know how the brain-computer interface would create outstanding new opportunities for her cousin Markus. An example from everyday life: he orders an Audi to his home via an app and eye tracking, and controls his wheelchair with the BCI. The car recognises his smartphone via Bluetooth, opens the door and lowers the ramp, and the seats fold back so that Markus can drive in. He is welcomed by PIA, Audi’s personal voice assistant. The air conditioning adjusts to his favourite setting. Meanwhile, his brain-computer interface connects to the Audi so that he can control the applications on the display: music selection, volume, even a stopover. When he arrives at his destination, the doors open, the ramp extends and Markus drives out.

A vision of the future? For now, yes. There is still a lot to do before this becomes reality: workshops to learn the commands, more mature technology and, of course, clarification of the legal framework. But the way has been paved. And that would mean a better quality of life for Markus and many others with disabilities – achieved simply through the power of thought and Cornelia’s work.