Focusing on LEDs makes robotic exoskeleton work
Korean and German researchers have built a mind-controlled robotic exoskeleton system for paraplegics that picks up brain signals when the user focuses on LED lights (Korea University and TU Berlin)

Researchers from Korea University and the Technical University of Berlin (TU Berlin) have developed a robotic exoskeleton suit that can enable paraplegics to control it using their minds.

In June 2014, Juliano Pinto, a 29-year-old paraplegic man, became the first person ever to use a mind-controlled exoskeleton to kick off the World Cup.

It took only a second to kick the football on the pitch in Brazil, watched by millions around the world, but enabling Pinto to control the exoskeleton suit required seven months of intensive training in front of a computer, preceded by 12 years of research from Dr Miguel Nicolelis of Duke University and Dr Gordon Cheng of TU Munich.

But now researchers from Korea and Germany have found a different way to achieve the same result. Their exoskeleton suit system requires the user to wear an electroencephalogram (EEG) cap and then stare at a device facing them that has five LED lights embedded into it.

Focusing on the LEDs gives the signal for the suit to move

Each of the LEDs flickers at a different frequency, and when the user focuses his or her attention on one of them, the system identifies the corresponding brain signal and commands the robotic suit to move forward, turn left or right, sit down or stand still.
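
The article does not give the decoding details, but the underlying principle, steady-state visual evoked potentials (SSVEP), can be sketched in a few lines of Python. The flicker frequencies, command names and sampling rate below are illustrative assumptions, not values from the study:

```python
import numpy as np

# Hypothetical flicker-frequency-to-command mapping (Hz); the actual
# frequencies used in the study are not given in the article.
COMMANDS = {9.0: "forward", 11.0: "turn_left", 13.0: "turn_right",
            15.0: "sit_down", 17.0: "stand_still"}

def classify_ssvep(eeg: np.ndarray, fs: float) -> str:
    """Pick the candidate flicker frequency with the strongest spectral
    power in a single-channel EEG window, and return its command."""
    spectrum = np.abs(np.fft.rfft(eeg * np.hanning(len(eeg))))
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    # Power at the FFT bin nearest each candidate frequency.
    power = {f: spectrum[np.argmin(np.abs(freqs - f))] for f in COMMANDS}
    return COMMANDS[max(power, key=power.get)]

# Example: 2 s of synthetic EEG at 250 Hz with an 11 Hz SSVEP component.
fs = 250.0
t = np.arange(0, 2, 1 / fs)
eeg = 0.5 * np.sin(2 * np.pi * 11.0 * t) + 0.3 * np.random.randn(t.size)
print(classify_ssvep(eeg, fs))  # usually prints "turn_left"
```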

The researchers say that one of the problems with exoskeleton suits currently in development is that it can be difficult for the suit to pick out the brain signals that indicate intended movement from the rest of the brain's activity.

"Exoskeletons create lots of electrical 'noise'," explained Professor Klaus Müller of the machine learning department at TU Berlin's Institute of Software Engineering and Theoretical Computer Science, who co-authored the paper.

"The EEG signal gets buried under all this noise – but our system is able to separate not only the EEG signal, but the frequency of the flickering LED within this signal."

The system can be used on other existing robotic exoskeletons

The difference between the two solutions is that with the exoskeleton suit from the Walk Again Project shown in Brazil, patients first have to train their brains to control a digital avatar on a computer screen, and then transfer that same technique and thought process to the exoskeleton suit.

Also, the Brazilian suit is designed to become an extension of the patient and can be used anywhere, even outdoors. Its system is controlled by a Raspberry Pi computer, and the suit comes with artificial skin filled with sensors to give the user the feeling of pressure, touch and vibrations as they walk.

The Korean-German system, on the other hand, only requires the user to focus on the LED lights of a device suspended in front of their line of sight. The trade-off is that the brain signal has to be processed remotely: the setup relies on a wireless EEG signal receiver and a separate signal processing unit located in the same room as the person using the exoskeleton suit.
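
As a rough illustration of that off-board architecture, the signal processing unit could run a loop like the one below. Every name here (the receiver driver, the classifier, the suit's address and command protocol) is a hypothetical stand-in, as the article does not describe the actual interfaces:

```python
import socket

EXO_ADDR = ("192.168.0.20", 9000)  # assumed address of the suit's controller

def processing_loop(read_eeg_window, classify):
    """Off-board signal processing: pull EEG windows from the wireless
    receiver (read_eeg_window stands in for that driver), decode the
    attended LED, and relay the command to the exoskeleton over UDP."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        window = read_eeg_window()    # e.g. 2 s of samples x channels
        command = classify(window)    # "forward", "turn_left", ...
        sock.sendto(command.encode(), EXO_ADDR)
```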

The researchers hope their system can be adapted to control other existing exoskeleton suits.

"People with amyotrophic lateral sclerosis (ALS) [motor neuron disease], or high spinal cord injuries face difficulties communicating or using their limbs. Decoding what they intend from their brain signals could offer means to communicate and walk again," said Müller.

"We were driven to assist disabled people, and our study shows that this brain control interface can easily and intuitively control an exoskeleton system - despite the highly challenging artefacts from the exoskeleton itself."

The paper, entitled "A lower limb exoskeleton control system based on steady state visual evoked potentials", is published in the Journal of Neural Engineering.