Published in the journal PLOS ONE, the study demonstrates the use of radio waves to measure heart-rate and breathing signals and, from these, predict how someone is feeling in the absence of visual cues such as facial expressions.
Achintha Avin Ihalage, a PhD student at QMUL, said the proposed deep learning approach uses a novel neural architecture that processes time-dependent wireless signals and frequency-domain wavelet-transform images simultaneously, while preserving both temporal and spatial relationships.
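To illustrate the idea (this is a minimal sketch, not the authors' published model), a dual-input network of this kind might pair a recurrent branch for the raw signal with a convolutional branch for its wavelet image; all layer sizes and names below are assumptions:

```python
# Hypothetical PyTorch sketch of a dual-branch network: one branch preserves
# the temporal structure of the raw wireless signal, the other the spatial
# structure of its wavelet-transform (scalogram) image. Illustrative only.
import torch
import torch.nn as nn

class DualBranchEmotionNet(nn.Module):
    def __init__(self, n_emotions=4):
        super().__init__()
        # Temporal branch: LSTM over the 1-D signal preserves time dependence.
        self.lstm = nn.LSTM(input_size=1, hidden_size=64, batch_first=True)
        # Spatial branch: small CNN over the 2-D scalogram preserves
        # time-frequency locality.
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.classifier = nn.Linear(64 + 32 * 4 * 4, n_emotions)

    def forward(self, signal, scalogram):
        # signal: (batch, time, 1); scalogram: (batch, 1, H, W)
        _, (h, _) = self.lstm(signal)            # final hidden state summarises time
        spatial = self.cnn(scalogram).flatten(1)
        fused = torch.cat([h[-1], spatial], dim=1)
        return self.classifier(fused)            # logits for the four emotions
```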
While the ‘basic building blocks’ used to implement the neural network are well known and widely adopted, added first author and fellow PhD student Ahsan Noor Khan, the method of evoking emotions and its combination with wireless signals for unobtrusive sensing of breathing and heart rate is potentially new, and could open the door to new opportunities in wireless emotion detection.
“Being able to detect emotions using wireless systems is a topic of increasing interest for researchers as it offers an alternative to bulky sensors and could be directly applicable in future ‘smart’ home and building environments,” said Noor Khan. “In this study, we’ve built on existing work using radio waves to detect emotions and shown that the use of deep learning techniques can improve the accuracy of our results.”
During the study, a group of participants was asked to watch a video selected by the researchers for its ability to evoke anger, sadness, joy or pleasure. While each individual watched the video, the researchers emitted harmless radio signals towards them and measured the signals that bounced back. By analysing the changes in these signals, caused by slight body movements, the researchers said they could reveal ‘hidden’ information about an individual’s heart and breathing rates.
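As a rough illustration of this kind of analysis (the study's actual signal pipeline is not detailed here), one could band-pass filter the demodulated phase of the reflected signal, whose slow oscillations track chest displacement, and read the breathing and heart rates from the dominant spectral peaks. The sampling rate and cut-off frequencies below are assumptions:

```python
# Minimal sketch: estimate a physiological rate from a reflected-signal
# phase trace. Band-pass limits are illustrative (~0.1-0.5 Hz breathing,
# ~0.8-2 Hz heartbeat); the strongest FFT peak in the band gives the rate.
import numpy as np
from scipy.signal import butter, filtfilt

def dominant_rate_bpm(phase, fs, low_hz, high_hz):
    b, a = butter(2, [low_hz, high_hz], btype="band", fs=fs)
    filtered = filtfilt(b, a, phase)             # zero-phase band-pass
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1 / fs)
    band = (freqs >= low_hz) & (freqs <= high_hz)
    return 60.0 * freqs[band][np.argmax(spectrum[band])]  # cycles per minute

# e.g. breathing = dominant_rate_bpm(phase, fs=50, low_hz=0.1, high_hz=0.5)
#      heartbeat = dominant_rate_bpm(phase, fs=50, low_hz=0.8, high_hz=2.0)
```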
Previous research has used similar non-invasive or wireless methods of emotion detection; however, in those studies the data analysis depended on classical machine learning approaches, whereby a hand-engineered algorithm is used to identify and classify emotional states within the data.
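For contrast, a classical baseline of the kind described might feed handcrafted physiological features to an off-the-shelf classifier such as a support vector machine; the feature set and the random placeholder data below are purely illustrative:

```python
# Hypothetical classical-ML baseline: a few summary features per segment
# (e.g. mean heart rate, breathing rate, their variability) classified
# with an SVM. Random data stands in for real measurements.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

X = np.random.rand(60, 4)             # 60 segments x 4 physiological features
y = np.random.randint(0, 4, size=60)  # labels: anger, sadness, joy, pleasure

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
print(cross_val_score(clf, X, y, cv=5).mean())  # chance level is ~0.25 here
```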
“Based on our results, this technology can detect emotions at an accuracy of 71 per cent,” Avin Ihalage commented. “The experiment was conducted on 15 participants and no female participants were involved. Ideally, a widespread study involving more participants should be performed to evaluate the generalisability of this method. We note that increasing the number of emotions considered could be useful for future applications - again, this requires more data.”
The team believes the technology could have significant implications for the management of health and wellbeing, and plans to work with healthcare professionals and social scientists on public acceptance.
Noor Khan said that the current focus is on using the technology primarily in the healthcare sector, adding that while it has potential applications in many other sectors, the long-term commercialisation of the technology would need to address the associated ethical concerns, such as privacy breaches and data protection.
For future work, he explained that the team aims to evaluate its methods in an uncontrolled environment, where there may be multiple people or moving objects, to enable more practical, real-life emotion-detection schemes based on deep learning.
Professor Yang Hao, project lead, said that the research could also open up opportunities in areas such as human-robot interaction.