Humanoid robot trained for improved human-robot interactions

A humanoid robot has been trained to demonstrate a variety of expressive movements, an advance that could improve human-robot interactions.

Engineers at the University of California San Diego trained the humanoid robot with simple dance routines and gestures - such as waving, high-fiving and hugging - while maintaining a steady gait on diverse terrains.

“Through expressive and more human-like body motions, we aim to build trust and showcase the potential for robots to co-exist in harmony with humans,” Xiaolong Wang, a professor in the Department of Electrical and Computer Engineering at the UC San Diego Jacobs School of Engineering, said in a statement. “We are working to help reshape public perceptions of robots as friendly and collaborative rather than terrifying like The Terminator.”

Wang and his team will present their work in a paper at the 2024 Robotics: Science and Systems Conference, which takes place from July 15 to 19 in Delft, the Netherlands.

The humanoid robot has been trained on a diverse array of human body motions, enabling it to generalise to new motions and mimic them.

To train their robot, the team used a collection of motion capture data and dance videos. Their technique involved training the upper and lower body separately. This approach allowed the robot’s upper body to replicate various reference motions, such as dancing and high-fiving, while its legs focused on a steady stepping motion to maintain balance and traverse different terrains.
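To illustrate how such a split might be expressed in code, the sketch below shows a hypothetical reward function in which the upper body is scored on how closely it imitates a reference motion, while the legs are scored on tracking a commanded walking speed and staying upright. The field names, weights and scaling constants are illustrative assumptions, not details taken from the team's paper.

```python
import numpy as np

def expressive_motion_reward(robot_state, reference_motion):
    """Illustrative reward with separate upper- and lower-body terms.

    Hypothetical structure only: the upper body is rewarded for tracking
    a reference motion (e.g. a dance clip), while the lower body is
    rewarded for keeping a steady, balanced gait rather than copying
    the reference legs directly.
    """
    # Upper body: penalise deviation from the reference joint angles.
    upper_error = np.linalg.norm(
        robot_state["upper_joint_pos"] - reference_motion["upper_joint_pos"]
    )
    r_upper = np.exp(-2.0 * upper_error)

    # Lower body: reward tracking the commanded base velocity...
    vel_error = np.linalg.norm(
        robot_state["base_lin_vel"][:2] - robot_state["commanded_vel"][:2]
    )
    r_gait = np.exp(-4.0 * vel_error)

    # ...and staying level (small roll and pitch of the torso).
    r_upright = np.exp(-10.0 * abs(robot_state["base_roll_pitch"]).sum())

    # Weights are arbitrary illustrative choices.
    return 0.5 * r_upper + 0.3 * r_gait + 0.2 * r_upright
```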

The team said that despite the separate training of the upper and lower body, the robot operates under a unified policy that governs its entire structure.
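A minimal sketch of what such a unified policy could look like is shown below: a single network that takes the robot's proprioceptive state, the operator's command and an encoding of the upper-body reference motion, and outputs targets for every joint at once. All dimensions, names and layer sizes are assumptions made for illustration, not the team's actual architecture.

```python
import torch
import torch.nn as nn

class UnifiedHumanoidPolicy(nn.Module):
    """One network governing the whole body (illustrative sketch).

    Assumed inputs: proprioception (joint positions/velocities, base state),
    a velocity command from the operator, and an embedding of the
    upper-body reference motion. Output: one action vector covering both
    arms and legs, so a single policy controls the entire structure.
    """

    def __init__(self, proprio_dim=48, cmd_dim=3, motion_dim=64,
                 act_dim=19, hidden=256):
        super().__init__()
        obs_dim = proprio_dim + cmd_dim + motion_dim
        self.net = nn.Sequential(
            nn.Linear(obs_dim, hidden), nn.ELU(),
            nn.Linear(hidden, hidden), nn.ELU(),
            nn.Linear(hidden, act_dim),  # joint targets, upper + lower body
        )

    def forward(self, proprio, command, motion_embedding):
        # Concatenate all inputs into a single observation vector.
        obs = torch.cat([proprio, command, motion_embedding], dim=-1)
        return self.net(obs)
```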

“The main goal here is to show the ability of the robot to do different things while it’s walking from place to place without falling,” said Wang.

The skills were first learned in simulation on a virtual humanoid and then transferred to a real robot, which demonstrated the ability to execute both learned and new movements in real-world conditions.

Currently, the robot’s movements are directed by a human operator using a game controller, which dictates its speed, direction and specific motions. The team envisions a future version equipped with a camera that would enable the robot to perform tasks and navigate different terrains fully autonomously.
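A hypothetical mapping from controller inputs to the commands such a policy consumes might look like the sketch below; the button assignments, speed limits and field names are invented for illustration rather than drawn from the team's setup.

```python
def gamepad_to_command(left_stick, right_stick, button_pressed):
    """Map game-controller inputs to a command dictionary (illustrative).

    The left stick sets forward/lateral speed, the right stick sets
    turning rate, and a button press selects which reference motion the
    upper body should perform.
    """
    MAX_SPEED = 1.0      # m/s, assumed limit
    MAX_YAW_RATE = 1.0   # rad/s, assumed limit

    return {
        "forward_speed": left_stick[1] * MAX_SPEED,
        "lateral_speed": left_stick[0] * MAX_SPEED,
        "yaw_rate": right_stick[0] * MAX_YAW_RATE,
        # Example assignments: "A" triggers a wave, "B" a high-five, etc.
        "motion_id": {"A": "wave", "B": "high_five", "X": "dance"}.get(
            button_pressed, "idle"
        ),
    }

# Example: pushing the left stick fully forward while pressing "A"
# would command full walking speed with a waving upper-body motion.
command = gamepad_to_command(left_stick=(0.0, 1.0), right_stick=(0.0, 0.0),
                             button_pressed="A")
```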

The team is now focused on refining the robot’s design to tackle more intricate and fine-grained tasks.

“By extending the capabilities of the upper body, we can expand the range of motions and gestures the robot can perform,” said Wang.