Dubbed RF-Pose, the latest advance from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) uses AI to teach wireless devices to sense people’s postures and movement, even from the other side of a wall.
Led by CSAIL’s Prof Dina Katabi, the team used a neural network to analyse radio signals that bounce off people’s bodies, creating a dynamic stick figure that walks, stops, sits and moves its limbs as the person performs those actions.
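For readers curious what “a neural network analysing radio signals” might look like in practice, the sketch below is a minimal, hypothetical PyTorch encoder-decoder that maps two-channel RF heatmaps to per-keypoint confidence maps. It is not the CSAIL team’s architecture (their published system trains the RF network against a vision-based teacher network); the layer sizes, the two-channel input and the keypoint count are all illustrative assumptions.

```python
import torch
import torch.nn as nn

class RFPoseSketch(nn.Module):
    """Hypothetical encoder-decoder: RF heatmaps -> per-keypoint confidence maps."""
    def __init__(self, num_keypoints=14):
        super().__init__()
        # Encoder: two assumed input channels (e.g. horizontal and vertical RF heatmaps)
        self.encoder = nn.Sequential(
            nn.Conv2d(2, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
        )
        # Decoder: upsample back to a spatial grid, one confidence map per keypoint
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, num_keypoints, kernel_size=4, stride=2, padding=1),
        )

    def forward(self, rf_heatmaps):
        return self.decoder(self.encoder(rf_heatmaps))

# Shape check with synthetic data: a batch of 8 two-channel 128x128 RF "frames"
model = RFPoseSketch()
dummy = torch.randn(8, 2, 128, 128)
keypoint_maps = model(dummy)
print(keypoint_maps.shape)  # torch.Size([8, 14, 128, 128])
```

The peaks of each output map would indicate where the corresponding body joint is likely to be, which is how the stick figure can be reconstructed frame by frame.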
The system could be used to monitor diseases like Parkinson’s and multiple sclerosis by providing a better understanding of disease progression and allowing doctors to adjust medications accordingly. It could also help elderly people live more independently, while providing the added security of monitoring for falls, injuries and changes in activity patterns.
The team is currently working with doctors to explore multiple applications in healthcare.
“A key advantage of our approach is that patients do not have to wear sensors or remember to charge their devices,” said Katabi.
RF-Pose could also be used for new classes of video games in which players move around the house, or in search-and-rescue missions to help locate survivors.
The research is detailed in a paper titled “Through-Wall Human Pose Estimation Using Radio Signals”.