To this end, researchers at Columbia University have tasked robots with teaching themselves about the structure of their own bodies and how they move by watching their motions with a camera.
With this knowledge, the robots could plan their own actions and overcome damage to their bodies.
"Like humans learning to dance by watching their mirror reflection, robots now use raw video to build kinematic self-awareness," said study lead author Yuhang Hu, a doctoral student at the Creative Machines Lab at Columbia University. "Our goal is a robot that understands its own body, adapts to damage, and learns new skills without constant human programming."
Most robots first learn to move in simulations. Once a robot can move in these virtual environments, it enters the physical world where it continues to learn. “The better and more realistic the simulator, the easier it is for the robot to make the leap from simulation into reality,” said Hod Lipson, James and Sally Scapa Professor of Innovation and chair of the Department of Mechanical Engineering.
However, creating a good simulator can be an arduous process that typically requires skilled engineers. The researchers taught a robot how to create a simulator of itself by watching its own motion through a camera.
“This ability not only saves engineering effort, but also allows the simulation to continue and evolve with the robot as it undergoes wear, damage, and adaptation,” said Lipson.
In the new study, the researchers developed a way for robots to autonomously model their own 3D shapes using a 2D camera. This breakthrough was driven by three deep neural networks that inferred 3D motion from 2D video, enabling the robot to understand and adapt to its own movements. The new system could also identify alterations to the bodies of the robots, such as a bend in an arm, and help them adjust their motions to recover from this simulated damage.
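The core idea of lifting 2D camera observations into a 3D self-model can be illustrated with a toy example. The sketch below is an invented, minimal stand-in, not the study's actual architecture (the paper uses three trained deep networks and raw video): it maps 2D keypoint coordinates from a single frame to 3D joint estimates with a small randomly initialized network, purely to show the shape of the inference problem.

```python
import numpy as np

# Illustrative sketch only: a toy network that "lifts" 2D keypoint
# observations into 3D joint estimates. All dimensions, weights, and
# names here are hypothetical; the study's real system uses three deep
# neural networks trained on raw video of the robot's own motion.

rng = np.random.default_rng(0)

N_JOINTS = 4                 # hypothetical robot arm with 4 joints
IN_DIM = N_JOINTS * 2        # (x, y) pixel coordinates per joint
HIDDEN = 16                  # small hidden layer for the sketch
OUT_DIM = N_JOINTS * 3       # (x, y, z) estimate per joint

# Randomly initialized weights stand in for trained parameters.
W1 = rng.normal(0.0, 0.1, (IN_DIM, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(0.0, 0.1, (HIDDEN, OUT_DIM))
b2 = np.zeros(OUT_DIM)

def lift_2d_to_3d(keypoints_2d):
    """Map flattened 2D keypoints to 3D joint positions (toy MLP)."""
    h = np.tanh(keypoints_2d @ W1 + b1)        # hidden activations
    return (h @ W2 + b2).reshape(N_JOINTS, 3)  # one 3D point per joint

# One simulated camera observation: 2D pixel coords for each joint.
frame_keypoints = rng.uniform(0.0, 1.0, IN_DIM)
joints_3d = lift_2d_to_3d(frame_keypoints)
print(joints_3d.shape)  # (4, 3)
```

In the real system, weights like these would be learned by watching the robot's own motion, so the mapping stays current as the body wears or is damaged.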
Such adaptability might prove useful in a variety of real-world applications. "Imagine a robot vacuum or a personal assistant bot that notices its arm is bent after bumping into furniture," said Hu. "Instead of breaking down or needing repair, it watches itself, adjusts how it moves, and keeps working. This could make home robots more reliable: no constant reprogramming required."
Another scenario might involve a robot arm getting knocked out of alignment at a car factory. "Instead of halting production, it could watch itself, tweak its movements, and get back to welding, cutting downtime and costs," said Hu. "This adaptability could make manufacturing more resilient."
“Robots need to learn to take care of themselves, if they are going to become truly useful,” said Lipson. “That’s why self-modelling is so important.”
The researchers detailed their findings on February 25, 2025, in Nature Machine Intelligence.