Locked-in syndrome is a rare neurological disorder. It is often triggered by amyotrophic lateral sclerosis (ALS), an incurable degenerative disease of the motor nervous system. Affected individuals risk losing voluntary muscle control completely while consciousness and mental functions remain intact: they can still see and hear, but usually only eyelid movements remain as a means of communication. Brain-computer interface (BCI) technologies are expected to make communication with locked-in patients decisively easier. They are based on the finding that merely imagining a behavior triggers measurable changes in electrical brain activity. Imagining a hand or foot movement, for example, activates the motor cortex.
Brain-computer interface technologies are divided into invasive and non-invasive methods. In non-invasive methods, brain activity is measured with electrodes attached to the scalp. The measurements rely on electroencephalography (EEG), which has the disadvantage of low signal resolution and limited accuracy.
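As a rough illustration of this principle, the sketch below checks for the drop in mu-band (8–12 Hz) power over the motor cortex that typically accompanies imagined movement, using synthetic signals; the sampling rate, frequency band, and threshold are assumptions for illustration, not parameters of any particular BCI system.

```python
import numpy as np
from scipy.signal import welch

FS = 250           # assumed EEG sampling rate in Hz
MU_BAND = (8, 12)  # sensorimotor mu rhythm, typically attenuated during motor imagery

def mu_band_power(eeg_channel: np.ndarray, fs: int = FS) -> float:
    """Average spectral power of one EEG channel in the mu band."""
    freqs, psd = welch(eeg_channel, fs=fs, nperseg=fs)
    mask = (freqs >= MU_BAND[0]) & (freqs <= MU_BAND[1])
    return float(np.mean(psd[mask]))

def detects_motor_imagery(trial: np.ndarray, baseline: np.ndarray, ratio: float = 0.7) -> bool:
    """Flag motor imagery when mu power drops clearly below the resting baseline
    (event-related desynchronization). The 0.7 threshold is illustrative only."""
    return mu_band_power(trial) < ratio * mu_band_power(baseline)

# Synthetic demo: "rest" has a strong 10 Hz rhythm, "imagery" has it attenuated.
t = np.arange(0, 2, 1 / FS)
rest = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)
imagery = 0.3 * np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)
print(detects_motor_imagery(imagery, rest))  # usually True
```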
Invasive Brain-Computer Interface Technologies
Invasive methods can compensate for these weaknesses: here, electrodes for electrocorticography (ECoG) measurements are implanted over the motor cortex. To date, however, invasive brain-computer interface technologies still lack the desired combination of miniaturization and high spatial resolution, which requires a large number of measurement points in a small space. In addition, the software cannot yet be operated by the patients on their own. For both EEG-based and intracortical systems, calibration has to be repeated again and again to bring the algorithms up to date, explains Professor Gernot Müller-Putz from the Institute of Neurotechnology at Graz University of Technology, Austria.
He is currently conducting research in the European consortium INTRECOM, which aims to solve these problems. The implantable technology is intended to decode speech from brain signals in real time. Locked-in patients would thus, for the first time, have a complete and easy-to-use communication system with which they can speak and control a computer cursor.
Imagining a movement
The consortium of research and industry partners is led by Professor Nick Ramsey of the Dutch University Medical Center UMC Utrecht. He has already shown in preliminary work that an attempted hand movement can be detected and used as a mouse click. This works much like existing assistive technology, in which individual letters are scanned in sequence and the patient selects a letter with a click, Professor Müller-Putz explains.
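The following toy sketch shows how a single detected "click" signal can drive such a scanning speller; the letter grid and the two-stage row/column scheme are assumptions for illustration, not the software used in the preliminary work.

```python
# Hypothetical letter grid for a scanning speller; the real layout is not specified in the article.
GRID = [
    list("ABCDEF"),
    list("GHIJKL"),
    list("MNOPQR"),
    list("STUVWX"),
    list("YZ .,?"),
]

def select_letter(clicks):
    """Two-stage row/column scanning: the highlight steps through the rows until a
    click fixes the row, then steps through that row's letters until a second click
    fixes the letter. `clicks` holds the highlight indices at which a click was detected."""
    row_index, col_index = clicks
    row = GRID[row_index % len(GRID)]
    return row[col_index % len(row)]

# Example: a click while the third row is highlighted, then while its fifth letter is highlighted.
print(select_letter((2, 4)))  # prints 'Q'
```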
He himself has just completed the EU project Feel Your Reach, in which he was able to reconstruct the trajectories of imagined arm movements from EEG signals with a certain degree of accuracy. This technology is to be refined further in the current project. At Graz University of Technology, the focus so far has been on non-invasive brain-computer interface technologies. Together with Professor Ramsey, Müller-Putz is now working for the first time with ECoG measurements, in which the material carrying the electrodes – the so-called array – rests directly on the motor cortex.
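A common way to relate such signals to movement is a linear decoder. The sketch below fits a ridge regression from synthetic "EEG features" to a two-dimensional trajectory, purely to illustrate the idea; the feature dimensions and noise levels are invented, and the project's actual decoding methods are not described here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for preprocessed EEG features: n_samples x n_features
# (e.g., low-frequency amplitudes from many channels). Purely synthetic.
n_samples, n_features = 1000, 32
X = rng.standard_normal((n_samples, n_features))

# Toy 2-D arm/cursor trajectory that is (noisily) a linear function of the features,
# mimicking the assumption behind linear decoding of movement kinematics.
true_W = rng.standard_normal((n_features, 2))
Y = X @ true_W + 0.5 * rng.standard_normal((n_samples, 2))

def ridge_fit(X, Y, lam=1.0):
    """Closed-form ridge regression: W = (X'X + lam*I)^-1 X'Y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)

W = ridge_fit(X, Y)
predicted_trajectory = X @ W  # decoded x/y positions over time
correlation = np.corrcoef(predicted_trajectory[:, 0], Y[:, 0])[0, 1]
print(f"decoded-vs-actual correlation (x axis): {correlation:.2f}")
```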
Two research approaches
To make sure the research advances, the partners are pursuing two approaches. Team Ramsey wants to generate speech from attempted speech: the researchers evaluate the person's attempts to produce the individual sounds of a spoken word and can thus read from the brain signals, in real time, what the person is trying to say.
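As a hedged illustration of this decoding idea, the toy example below assigns each short window of synthetic neural features to the nearest "phoneme" template; the phoneme set, channel count, and classifier are invented for illustration and say nothing about the consortium's actual models.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: each attempted sound produces a characteristic pattern
# across ECoG channels; here three made-up "phoneme" classes with synthetic templates.
PHONEMES = ["a", "k", "s"]
n_channels = 16
centroids = rng.standard_normal((len(PHONEMES), n_channels))

def simulate_trial(phoneme_idx):
    """Synthetic neural feature vector for one attempted sound."""
    return centroids[phoneme_idx] + 0.4 * rng.standard_normal(n_channels)

def decode(trial_features):
    """Nearest-centroid decoding: pick the phoneme whose template is closest."""
    distances = np.linalg.norm(centroids - trial_features, axis=1)
    return PHONEMES[int(np.argmin(distances))]

# Decode a short sequence of attempted sounds window by window.
attempted = [0, 2, 1]  # the person tries to say "a", "s", "k"
decoded = [decode(simulate_trial(i)) for i in attempted]
print("".join(decoded))  # typically prints "ask"
```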
The Müller-Putz team is focusing on every other form of communication that can be mapped to cursor control, from simply selecting icons on the screen to continuous cursor movements and selections that the patient controls.
The brain-computer interface hardware consists of an electrode array and a biosignal amplifier. While the electrode array is placed over the motor areas, the biosignal amplifier is implanted in the skull bone. The amplifier processes the data and transmits them wirelessly to external computers for analysis and decoding.
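The data path might be pictured as follows; the channel count and frame layout in this sketch are invented, since the implant's actual wireless protocol is not described in the article.

```python
import struct
import numpy as np

# Illustrative only: digitize on the implant, send small frames wirelessly,
# unpack and decode on an external computer.
N_CHANNELS = 64
HEADER = struct.Struct("<IH")  # frame counter (uint32), channel count (uint16)

def pack_frame(counter: int, samples: np.ndarray) -> bytes:
    """One wireless frame: small header plus one int16 sample per channel."""
    return HEADER.pack(counter, samples.size) + samples.astype("<i2").tobytes()

def unpack_frame(frame: bytes):
    counter, n = HEADER.unpack_from(frame)
    samples = np.frombuffer(frame, dtype="<i2", offset=HEADER.size, count=n)
    return counter, samples

# Round trip for one frame of simulated ECoG samples.
raw = (np.random.randn(N_CHANNELS) * 100).astype(np.int16)
counter, received = unpack_frame(pack_frame(42, raw))
assert counter == 42 and np.array_equal(received, raw)
```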
Miniaturization vs. high resolution
Among the technical challenges is the aforementioned miniaturization, a prerequisite for implantation. Recording brain signals requires high spatial resolution, that is, a very large number of measurement points relative to the size of the array: the smaller the array, the more densely the electrodes must be packed. The temporal resolution is in the millisecond range. Both high spatial and high temporal resolution are fundamental to decoding speech in real time.
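A back-of-the-envelope calculation shows why this matters for the wireless link; all numbers are assumptions for illustration, not INTRECOM specifications.

```python
# Rough data rate of an implanted recording system (illustrative values only).
channels = 128           # measurement points on the array
sampling_rate_hz = 1000  # 1 kHz -> millisecond-range temporal resolution
bits_per_sample = 16

bits_per_second = channels * sampling_rate_hz * bits_per_sample
print(f"{bits_per_second / 1e6:.1f} Mbit/s")  # about 2.0 Mbit/s to be sent wirelessly
```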
To convert the brain signals into spoken words, algorithms extract parameters from the measurement data. These parameters indicate whether the person is trying to produce sounds with the mouth or to move the cursor with the hand. Ultimately, the system must be embedded in software that works in a home setting without technical experts on hand. To achieve this, the system has to be easy to use and robust while drawing on the latest AI-based and self-learning technologies.
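As a hedged sketch of that routing step, the example below extracts a simple power feature per channel and decides whether a window of synthetic signals should go to a speech decoder or a cursor decoder; the feature, the channel split, and the rule are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

def extract_features(window: np.ndarray) -> np.ndarray:
    """Per-channel signal power in one analysis window."""
    return np.mean(window ** 2, axis=1)

def classify_intent(features: np.ndarray) -> str:
    """Toy rule: compare activity on assumed 'speech' channels vs. 'hand' channels."""
    speech_score = features[:32].mean()  # channels over speech motor areas (assumed)
    hand_score = features[32:].mean()    # channels over hand motor areas (assumed)
    return "speech" if speech_score > hand_score else "cursor"

window = rng.standard_normal((64, 200))  # 64 channels x 200 samples
if classify_intent(extract_features(window)) == "speech":
    print("route window to the speech decoder")
else:
    print("route window to the cursor decoder")
```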
Industry partners
Two industrial partners in the consortium are responsible for designing the hardware: the Swiss-based Wyss Center for Bio and Neuroengineering is designing the biosignal amplifier, and the German medical device manufacturer CorTec will develop parts of the implantable electronics that record the brain signals: custom high-resolution ECoG electrodes with high-channel-count wiring.
“The individual components already exist in different designs. We’re now going to refine them and bring different things together for the first time such that we can implement them appropriately. That’s the exciting part,” says Müller-Putz. The brain-computer interface will be tested on two people with locked-in syndrome in Utrecht and in Graz.
About the INTRECOM project
The project is scheduled to start in the fall. Professor Müller-Putz is currently working on the preparations and is still looking for interested postdocs and PhD students for the team at the Institute of Neurotechnology at Graz University of Technology, Austria.
Intracranial Neuro Telemetry to REstore COMmunication (INTRECOM) was selected by the European Innovation Council (Pathfinder Programme) and is funded by the EU with almost four million euros. The project will run from fall 2022 to fall 2026.