Described in Applied Soft Computing, the method allows a person to control a robotic arm through a brain-machine interface (BMI), which translates neural signals into commands for a machine.
Two main techniques are used to monitor neural signals in BMIs: electroencephalography (EEG) and electrocorticography (ECoG).
EEG records signals from electrodes on the surface of the scalp and is non-invasive, relatively cheap, safe and easy to use. However, it has low spatial resolution and picks up irrelevant neural signals, which makes it difficult to interpret an individual's intentions from the recordings.
ECoG is invasive and involves placing electrodes directly on the surface of the cerebral cortex, beneath the scalp. Compared with EEG, ECoG can monitor neural signals with much higher spatial resolution and less background noise, but the technique also has shortcomings.
“The ECoG is primarily used to find potential sources of epileptic seizures, meaning the electrodes are placed in different locations for different patients and may not be in the optimal regions of the brain for detecting sensory and movement signals,” said Professor Jaeseung Jeong, a brain scientist at KAIST (Korea Advanced Institute of Science and Technology). “This inconsistency makes it difficult to decode brain signals to predict movements.”
Professor Jeong’s team developed a new method for decoding ECoG neural signals during arm movement. The system is said to be based on an ‘echo-state network’, a type of recurrent neural network used to analyse and predict time-series signals, and the Gaussian distribution, a mathematical probability model.
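The article does not spell out the decoder's architecture, but the general echo-state technique uses a fixed, randomly connected 'reservoir' of neurons whose states are read out by a simple trained layer. Below is a minimal Python sketch of that idea, with synthetic data standing in for the ECoG channels and movement targets; the network sizes, parameters and ridge-regression readout are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes only; the article does not give the real ones.
n_inputs = 8        # e.g. number of ECoG channels used (assumed)
n_reservoir = 300   # reservoir size (assumed)
spectral_radius = 0.9
ridge = 1e-4

# Fixed random input and recurrent weights; only the readout is trained.
W_in = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_inputs))
W = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_reservoir))
W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))  # echo-state property

def reservoir_states(u):
    """Drive the reservoir with an input sequence u of shape (T, n_inputs)."""
    x = np.zeros(n_reservoir)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ u_t + W @ x)
        states.append(x.copy())
    return np.asarray(states)

# Synthetic stand-ins for recorded signals and movement targets.
u = rng.standard_normal((500, n_inputs))   # fake multichannel neural signal
y = rng.standard_normal((500, 3))          # fake 3-D movement trajectory

# Ridge-regression readout from reservoir states to the target trajectory.
X = reservoir_states(u)
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_reservoir), X.T @ y)
pred = X @ W_out                           # decoded movement estimate
```

Because the reservoir weights stay fixed, only the linear readout needs fitting, which keeps training cheap; that property is one reason echo-state networks are attractive for decoding noisy time-series signals.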
In the study, the researchers recorded ECoG signals from four individuals with epilepsy while they were performing a reach-and-grasp task. Because the ECoG electrodes were placed according to the potential sources of each patient’s epileptic seizures, only 22 per cent to 44 per cent of the electrodes were located in the regions of the brain responsible for controlling movement.
During the movement task, the participants were given visual cues, either by placing a real tennis ball in front of them, or via a virtual reality headset showing a clip of a human arm reaching forward in first-person view. According to KAIST, they were asked to reach forward, grasp an object, then return their hand and release the object while wearing motion sensors on their wrists and fingers. In a second task, they were instructed to imagine reaching forward without moving their arms.
The researchers monitored the signals from the ECoG electrodes during real and imaginary arm movements and tested whether the new system could predict the direction of the movement from the neural signals. They found that the novel decoder classified arm movements into one of 24 directions in three-dimensional space, in both the real and virtual tasks, and that its accuracy was at least five times higher than chance. They also used a computer simulation to show that the novel ECoG decoder could control the movements of a robotic arm.
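To put that figure in context, chance level for a 24-way classification is one in 24, so 'five times higher than chance' implies an accuracy of roughly 21 per cent, as this quick calculation shows:

```python
n_directions = 24
chance = 1 / n_directions        # ~4.2% if every direction is equally likely
implied_accuracy = 5 * chance    # "at least five times higher than chance"
print(f"chance level: {chance:.1%}, implied minimum accuracy: {implied_accuracy:.1%}")
```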
The team said the next steps will be to improve the accuracy and efficiency of the decoder, which could be used in a real-time BMI device to help people with movement or sensory impairments.