Brainwave reader helps people with MND communicate with thoughts

People unable to communicate through speech or eye movement can use their thoughts to indicate ‘yes’ or ‘no’ thanks to a brainwave reader developed by Nottingham Trent University (NTU).

The AI system learns the ‘signatures’ or ‘features’ that the EEG signals create for different imaginations (Image: NTU)

Amin Al-Habaibeh, Professor of Intelligent Engineering Systems within the Product Design team at NTU, wanted to support charities that help people with advanced Motor Neurone Disease (MND) and Completely Locked-in Syndrome after his brother-in-law died of MND at the age of 38.

The research has led to the development of an affordable brainwave reader made with off-the-shelf parts and a novel artificial intelligence (AI) algorithm developed by the research team.

The technology centres on interpreting people’s brain signals when they are invited to envisage contrasting imaginary situations, or ‘imaginations’, to indicate ‘yes’ or ‘no’ answers.

Professor Al-Habaibeh said the off-the-shelf components include three low-cost electroencephalography (EEG) boards that give insight into brain activity by amplifying minute brain signals. The amplified signals are captured by an off-the-shelf data acquisition board that interfaces with a computer, where software records them for signal processing and AI analysis.
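As an illustration only, the Python sketch below shows what such a capture pipeline could look like; the sampling rate, window length, channel count and the `read_samples` stub are assumptions rather than details of the NTU hardware.

```python
import numpy as np

SAMPLE_RATE_HZ = 250   # assumed EEG sampling rate
WINDOW_SECONDS = 2     # assumed length of each analysis window
N_CHANNELS = 3         # one channel per low-cost EEG board (assumption)


def read_samples(n_samples: int) -> np.ndarray:
    """Placeholder for the data acquisition interface.

    In a real setup this would poll the off-the-shelf acquisition board;
    here it returns synthetic noise so the pipeline sketch runs end to end.
    """
    return np.random.randn(n_samples, N_CHANNELS)


def capture_window() -> np.ndarray:
    """Capture one fixed-length window of multi-channel EEG for processing."""
    n = SAMPLE_RATE_HZ * WINDOW_SECONDS
    return read_samples(n)


if __name__ == "__main__":
    window = capture_window()
    print(f"Captured window of shape {window.shape} (samples x channels)")
```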


“Different activities of the brain are expected to produce different signals at different locations,” said Professor Al-Habaibeh. “Our system captures the brain signals and seeks to identify their ‘signatures’. The underpinning research shows, for example, that identifying three ‘imaginations’ has an average accuracy of 90 per cent, with a maximum success rate of 100 per cent. We have found that the success of the system depends on the person’s focus, which is needed to capture high-quality signals both for training the AI and for identifying the imagination, or the person’s thought.”
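To illustrate how an accuracy figure of this kind can be estimated, the sketch below cross-validates an off-the-shelf classifier on synthetic three-class feature data; the data and the RandomForestClassifier are stand-ins for illustration, not the team’s own AI algorithm.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in data: feature vectors extracted from EEG windows,
# labelled 0/1/2 for three different imagined scenarios ("imaginations").
rng = np.random.default_rng(0)
n_per_class, n_features = 60, 16
X = np.vstack([rng.normal(loc=c, scale=1.0, size=(n_per_class, n_features))
               for c in range(3)])
y = np.repeat([0, 1, 2], n_per_class)

# An off-the-shelf classifier as an illustrative stand-in; 5-fold
# cross-validation estimates how reliably the three imaginations
# can be told apart from their feature vectors.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean accuracy across folds: {scores.mean():.2f}")
```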

He added that researchers have tried to predict words by asking volunteers or patients to think of the exact words they are trying to say, but with little success.

“Our novel method is much more successful because we can create different EEG signatures based on specific imagination and then link them to what the person is trying to say or do, such as ‘yes/no’ or controlling a computer mouse on the screen.”

He continued: “We also use novel signal processing and AI algorithms to detect the EEG features, or signatures, to predict what the person is trying to say or do via imaginations.”
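As a rough illustration of turning an EEG window into ‘signature’ features and mapping a prediction to an answer, the sketch below computes per-channel band power with SciPy; the chosen frequency bands and the yes/no mapping are assumptions, since the article does not describe the team’s actual features or algorithms.

```python
import numpy as np
from scipy.signal import welch

SAMPLE_RATE_HZ = 250
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # common EEG bands


def band_power_features(window: np.ndarray) -> np.ndarray:
    """Reduce one EEG window (samples x channels) to band-power features.

    Band power per channel is one widely used EEG 'signature'; the study's
    actual feature set is not described in the article.
    """
    feats = []
    for ch in range(window.shape[1]):
        freqs, psd = welch(window[:, ch], fs=SAMPLE_RATE_HZ, nperseg=256)
        for lo, hi in BANDS.values():
            mask = (freqs >= lo) & (freqs < hi)
            feats.append(psd[mask].sum())
    return np.array(feats)


def decode_answer(predicted_class: int) -> str:
    """Map a predicted imagination class to a 'yes'/'no' answer (assumed mapping)."""
    return {0: "yes", 1: "no"}.get(predicted_class, "unknown")


if __name__ == "__main__":
    demo_window = np.random.randn(SAMPLE_RATE_HZ * 2, 3)  # synthetic 2-second window
    print("Feature vector length:", band_power_features(demo_window).shape[0])
    print("Decoded answer:", decode_answer(0))
```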

The cost of hardware for each reader is estimated at around £300, and the research is being published under a Creative Commons licence.

The overall aim is to make the technology affordable so that it can be used more widely by families and hospices, rather than to roll it out commercially.