The system was developed by musicians at Plymouth University and electronic engineers at Essex University and then tested on a patient with locked-in syndrome — a severe form of paralysis.
‘We’re talking about patients who are completely locked in,’ Ramaswamy Palaniappan of Essex University told The Engineer. ‘The brain is active but the rest of the body is practically dead, so the only form of communication is by using their thoughts, and what we’re trying to do is tap into this.’
While BCI systems have in the past allowed patients to ‘play’ music, the current system is claimed to take it to a new level by varying the amplitude of the signal to string together different combinations of notes. In addition, Palaniappan said this was the first time in the UK that such a system had been trialled on an actual patient rather than on laboratory volunteers.
The research team used a method called steady-state visual evoked potential (SSVEP), which combines electroencephalography (EEG) analysis with what the team terms a music engine module.
Participants sit in front of a computer screen that displays several ‘buttons’ that flash at different frequencies (normally between 8Hz and 16Hz).
The participant is asked to focus his or her attention on a particular button, and the EEG device he or she is wearing picks up brain activity at that frequency, a phenomenon known as the frequency-following effect. This frequency-tagged EEG signal is then matched to a pre-specified note, or series of notes, played by the computer.
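In practice, frequency tagging of this kind amounts to estimating how strongly the EEG responds at each button’s flicker frequency and mapping the strongest response to a note. The sketch below illustrates the idea only; the sample rate, flicker frequencies, band-power estimate and note mapping are assumptions for illustration, not the team’s actual music engine.

```python
# Minimal sketch of SSVEP frequency tagging (illustrative, not the team's pipeline):
# estimate spectral power at each button's flicker frequency and map the strongest
# response to a pre-specified note. Sample rate, frequencies and notes are assumed.
import numpy as np

FS = 256                                          # assumed EEG sample rate (Hz)
BUTTON_FREQS = [8.0, 10.0, 12.0, 16.0]            # flicker frequencies in the 8-16Hz range
NOTE_FOR_BUTTON = {8.0: "C4", 10.0: "E4",
                   12.0: "G4", 16.0: "C5"}        # hypothetical note mapping

def band_power(eeg_window: np.ndarray, freq: float, bandwidth: float = 0.5) -> float:
    """FFT power of the EEG window in a narrow band around `freq`."""
    spectrum = np.abs(np.fft.rfft(eeg_window * np.hanning(len(eeg_window)))) ** 2
    freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / FS)
    mask = (freqs >= freq - bandwidth) & (freqs <= freq + bandwidth)
    return float(spectrum[mask].sum())

def detect_attended_button(eeg_window: np.ndarray) -> float:
    """Return the flicker frequency with the strongest evoked response."""
    return max(BUTTON_FREQS, key=lambda f: band_power(eeg_window, f))

# Usage with one second of simulated single-channel EEG
window = np.random.randn(FS)
attended = detect_attended_button(window)
print(f"Attended button: {attended} Hz -> note {NOTE_FOR_BUTTON[attended]}")
```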
But where the current system differs from previous offerings is that it builds in a secondary level of control, where participants can control the intensity of their focus on the button to vary the composition.
For example, a sequence of five musical notes was stored in an array, and the patient could play the notes in sequence, moving up or down the scale, by varying their level of attention. The system also gives the participant helpful visual feedback: as he or she focuses harder, the flashing buttons get larger, and vice versa.
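A rough illustration of that secondary level of control, under assumed values: the strength of the participant’s response (a proxy for how hard they are focusing) selects a position in a five-note array, and the same value scales the on-screen button as feedback. The note names and attention thresholds below are hypothetical.

```python
# Illustrative sketch of the "secondary level of control": a normalised attention
# level picks a position in a five-note array (harder focus -> higher note), and the
# same value scales the flashing button as visual feedback. Values are assumptions.
NOTES = ["C4", "D4", "E4", "F4", "G4"]          # the five-note array from the example
ATTENTION_THRESHOLDS = [0.2, 0.4, 0.6, 0.8]     # hypothetical normalised focus levels

def note_for_attention(attention: float) -> str:
    """Map a normalised attention level (0..1) to one of the five stored notes."""
    index = sum(attention >= t for t in ATTENTION_THRESHOLDS)
    return NOTES[index]

def button_scale(attention: float, base_size: int = 40) -> int:
    """Visual feedback: the button grows as the participant focuses harder."""
    return int(base_size * (1.0 + attention))

for level in (0.1, 0.45, 0.9):
    print(level, note_for_attention(level), button_scale(level))
```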
The researchers trialled their system on a female patient at the Royal Hospital for Neuro-disability in London, who has locked-in syndrome, a form of almost total paralysis caused by brain lesions.
With practice, the patient was able to achieve quite a sophisticated level of musical control — and with a greater level of dexterity than the researchers themselves, according to Palaniappan.
‘The patient was able to use it for about two hours,’ he said. ‘She was so excited and didn’t want to give it up. For me, it was like a lifetime achievement. To see an actual person using it and applying the technology that I’ve been trying for several years now was a real big thing.’
The team are now trying to improve the ergonomics and aesthetics of the system: the EEG electrodes, for example, currently require a gel on the scalp to get a clear signal. The researchers also hope to make the interface look more like a musical instrument.