A robot developed at the University of California, San Diego has learned to smile and make facial expressions through a process of self-guided learning.
The UC San Diego researchers used machine learning to enable their robot, which resembles Einstein, to learn to make realistic facial expressions.
The faces of robots are increasingly realistic and the number of artificial muscles that control them is rising. In light of this trend, researchers from UC San Diego's Machine Perception Laboratory are studying the face and head of their robotic Einstein to find ways to automate the process of teaching robots to make lifelike facial expressions.
This Einstein robot head has about 30 facial muscles, each moved by a tiny servo motor connected to the muscle by a string. Today, a highly trained person must manually set up these kinds of realistic robots so that the servos pull in the right combinations to produce specific facial expressions. To begin to automate this process, the UCSD researchers looked to both developmental psychology and machine learning.
Developmental psychologists speculate that infants learn to control their bodies through systematic exploratory movements, including babbling to learn to speak. Initially, these movements appear to be executed in a random manner as infants learn to control their bodies and reach for objects.
‘We applied this same idea to the problem of a robot learning to make realistic facial expressions,’ said Javier Movellan, the director of UCSD’s Machine Perception Laboratory.
Although their preliminary results are promising, the researchers note that some of the learned facial expressions are still awkward. One potential explanation is that their model may be too simple to describe the coupled interactions between facial muscles and skin.
To begin the learning process, the UC San Diego researchers directed the Einstein robot head to twist and turn its face in all directions, a process called ‘body babbling’.
During this period the robot could see itself in a mirror and analyse its own expression using facial expression detection software created at UC San Diego called CERT (Computer Expression Recognition Toolbox). This provided the data necessary for machine learning algorithms to learn a mapping between facial expressions and the movements of the muscle motors.
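In rough code terms, this step amounts to collecting (servo command, observed expression) pairs during babbling and fitting a model that maps desired expression scores back to servo commands. The sketch below is illustrative only: the robot, mirror and CERT pipeline are replaced by a simulated linear face, ridge regression stands in for whatever learner the UCSD group actually used, and all names and dimensions are assumptions.

```python
# Illustrative sketch only: the robot, mirror and CERT are replaced by a
# simulated linear "face", and ridge regression stands in for whatever
# learner the UCSD researchers actually used. All names and sizes are assumed.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
N_SERVOS = 30            # roughly matches the ~30 facial muscles on the robot head
N_EXPR_FEATURES = 12     # assumed number of CERT-style expression scores
N_SAMPLES = 5000         # number of random babbling poses

# Stand-in for looking in the mirror and running CERT: an unknown linear map
# from servo commands to expression scores, plus a little sensor noise.
TRUE_FACE = rng.normal(size=(N_SERVOS, N_EXPR_FEATURES))

def observe_expression(servo_cmd):
    """Placeholder for 'pose the face, look in the mirror, run CERT'."""
    return servo_cmd @ TRUE_FACE + 0.01 * rng.normal(size=N_EXPR_FEATURES)

# 1. Body babbling: drive the face with random servo commands and record the
#    expression scores each pose produces.
servo_cmds = rng.random((N_SAMPLES, N_SERVOS))
expressions = np.array([observe_expression(c) for c in servo_cmds])

# 2. Learn the inverse mapping: desired expression scores -> servo commands.
expression_to_servos = Ridge(alpha=1.0).fit(expressions, servo_cmds)
```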
Once the robot learned the relationship between facial expressions and the muscle movements required to make them, the robot was able to make facial expressions it had never encountered.
For example, the robot learned eyebrow narrowing, which requires the inner eyebrows to move together and the upper eyelids to close a bit to narrow the eye aperture.
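Continuing the illustrative sketch above, synthesising a new expression then reduces to feeding the desired expression scores into the learned inverse map; the target values and the meaning of their indices below are invented purely for demonstration.

```python
# Continuing the sketch above: ask the learned inverse map for servo commands
# that should produce an expression it was never explicitly shown. The target
# vector and the meaning of its indices are invented purely for illustration.
target_expression = np.zeros(N_EXPR_FEATURES)
target_expression[3] = 1.0    # pretend index 3 is "brow lowerer" intensity
target_expression[5] = 0.4    # pretend index 5 is "lid tightener" intensity

servo_cmd = expression_to_servos.predict(target_expression.reshape(1, -1))[0]
servo_cmd = np.clip(servo_cmd, 0.0, 1.0)   # keep commands in the servos' valid range
# set_servo_positions(servo_cmd)           # placeholder for driving the real motors
```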
During the experiment, one of the servos burned out due to a misconfiguration, so the researchers ran the experiment without it, only to discover that the model automatically compensated for the missing servo by activating a combination of nearby servos.
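Within the same toy sketch, one way to picture that compensation is to repeat the babbling with the failed servo pinned at zero and refit, forcing the regression to reach the same expression scores with the remaining motors; the failed index below is arbitrary.

```python
# Toy illustration of compensating for a dead servo: repeat the babbling with
# that servo pinned at zero and refit, so the map must reach target expressions
# using only the remaining motors. The failed index is arbitrary.
FAILED_SERVO = 17
servo_cmds_2 = rng.random((N_SAMPLES, N_SERVOS))
servo_cmds_2[:, FAILED_SERVO] = 0.0           # the burned-out motor never moves
expressions_2 = np.array([observe_expression(c) for c in servo_cmds_2])
expression_to_servos_2 = Ridge(alpha=1.0).fit(expressions_2, servo_cmds_2)
```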
While the primary goal of the work was to solve the engineering problem of how to approximate the appearance of human facial muscle movements with motors, the researchers say it could also lead to insights into how humans learn and develop facial expressions.
The Einstein robot head at UC San Diego performs asymmetric random facial movements as a part of the expression learning process