According to neuroengineer Silvestro Micera, the study shows that extra arms can be extensively controlled and that simultaneous control with both natural arms is possible. Micera and team have reported their findings in Science Robotics.
The study is part of the Third-Arm project, which aims to provide a wearable robotic arm to assist in daily tasks or to help in search and rescue. Micera, Bertarelli Foundation Chair in Translational Neuroengineering at EPFL, Switzerland, and professor of Bioelectronics at Scuola Superiore Sant’Anna, Italy, believes that exploring the cognitive limitations of third-arm control may open gateways to a better understanding of the human brain.
“The main motivation of this third arm control is to understand the nervous system,” Micera said in a statement. “If you challenge the brain to do something that is completely new, you can learn if the brain has the capacity to do it and if it’s possible to facilitate this learning. We can then transfer this knowledge to develop, for example, assistive devices for people with disabilities, or rehabilitation protocols after stroke.”
“We want to understand if our brains are hardwired to control what nature has given us, and we’ve shown that the human brain can adapt to coordinate new limbs in tandem with our biological ones,” said Solaiman Shokur, co-PI of the study and EPFL senior scientist at EPFL’s Neuro-X Institute. “It’s about acquiring new motor functions, enhancement beyond the existing functions of a given user, be it a healthy individual or a disabled one. From a nervous system perspective, it’s a continuum between rehabilitation and augmentation.”
To explore the cognitive constraints of augmentation, the researchers first built a virtual environment to test a healthy user’s capacity to control a virtual arm using movements of their diaphragm. They found that diaphragm control does not interfere with actions such as moving one’s physiological arms, speaking, or directing one’s gaze.
In this virtual reality setup, the user is equipped with a belt that measures diaphragm movement. Wearing a virtual reality headset, the user sees three arms: the right arm and hand, the left arm and hand, and a third arm between the two with a symmetric, six-fingered hand.
“We made this hand symmetric to avoid any bias towards either the left or the right hand,” said Giulia Dominijanni, PhD student at EPFL’s Neuro-X Institute.
In the virtual environment, the user is then prompted to reach out with either the left hand, the right hand, or in the middle with the symmetric hand. In the real environment, the user holds onto an exoskeleton with both arms, which allows for control of the virtual left and right arms. Movement detected by the belt around the diaphragm is used for controlling the virtual middle, symmetric arm. The setup was tested on 61 healthy subjects in over 150 sessions.
“Diaphragm control of the third arm is actually very intuitive, with participants learning to control the extra limb very quickly,” said Dominijanni. “Moreover, our control strategy is inherently independent from the biological limbs and we show that diaphragm control does not impact a user’s ability to speak coherently.”
The researchers also successfully tested diaphragm control with a simplified robotic arm consisting of a rod that can be extended out and retracted. When the user contracts the diaphragm, the rod extends out. In an experiment similar to the VR environment, the user is asked to reach and hover over target circles with their left or right hand, or with the robotic rod.
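The mapping described above can be pictured as a simple proportional control law: the more the diaphragm expands beyond its resting level, the further the rod extends. This is a minimal sketch, assuming a normalized belt reading and a linear mapping; the function name, thresholds, and units are illustrative assumptions, not the study's actual control scheme.

```python
# Hypothetical sketch of diaphragm-to-rod proportional control.
# The signal range, linear mapping, and maximum extension are
# assumptions for illustration, not the study's actual parameters.

def rod_extension(diaphragm_signal: float,
                  rest_level: float = 0.2,
                  max_level: float = 0.8,
                  max_extension_cm: float = 30.0) -> float:
    """Map a normalized belt reading (0..1) to rod extension in cm.

    Readings at or below the resting level leave the rod retracted;
    readings at or above the maximum level fully extend it, with a
    linear ramp in between.
    """
    span = max_level - rest_level
    fraction = (diaphragm_signal - rest_level) / span
    fraction = min(1.0, max(0.0, fraction))  # clamp to [0, 1]
    return fraction * max_extension_cm

# A reading halfway between rest and maximum yields half extension.
print(rod_extension(0.5))  # 15.0
```

A clamped linear map like this is a common first choice for one-degree-of-freedom assistive interfaces because it is intuitive and tolerant of small sensor drift around the resting level.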
“Our next step is to explore the use of more complex robotic devices using our various control strategies, to perform real-life tasks, both inside and outside of the laboratory. Only then will we be able to grasp the real potential of this approach,” said Micera.