The UCSD robot watches itself to learn how to make new facial expressions.
Courtesy of UCSD
Researchers at the University of California, San Diego (UCSD), who demoed a realistic-looking Einstein robot at the TED Conference last February, have now gone a step further, infusing the robot with the ability to improve its own expressions through learning.
Previously, the head of the robot, designed by Hanson Robotics, could only respond to the people around it with a variety of preprogrammed expressions. With 31 motors and a realistic skinlike material called Frubber, the head delighted and surprised TED conference-goers last winter.
Inspired by how babies babble to learn words and expressions, the UCSD researchers have now given the Einstein-bot its own learning ability. Instead of being preprogrammed to make certain facial expressions, the robot experiments in front of a mirror, gradually learning how its motors control its facial expressions. In this way, it learns to re-create particular expressions. The group presented its paper last month at the 2009 IEEE International Conference on Development and Learning.
According to a press release from the university, "Once the robot learned the relationship between facial expressions and the muscle movements required to make them, the robot learned to make facial expressions it had never encountered."
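The paper's exact algorithm isn't spelled out here, but the mechanism described above maps naturally onto a simple self-supervised loop: issue random motor commands (the "babbling" step), observe the resulting expression in the mirror, and fit a model linking commands to outcomes, which can then be run in reverse to hit expressions never seen during practice. The short Python sketch below is a minimal illustration of that idea, not the researchers' implementation; the simulated linear face, the feature count, and the ridge-regression inverse model are all assumptions made for the example.

    import numpy as np
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(0)

    N_MOTORS = 31      # the Einstein head's servo count, per the article
    N_FEATURES = 12    # hypothetical number of expression features a camera reports

    # Stand-in for the mirror-and-camera loop: a fixed linear "face" (unknown
    # to the learner) that turns motor commands into observable features.
    TRUE_FACE = rng.normal(size=(N_MOTORS, N_FEATURES))

    def observe_expression(motors):
        """Return noisy expression features produced by a motor-command vector."""
        return motors @ TRUE_FACE + rng.normal(0.0, 0.01, N_FEATURES)

    # 1. "Babbling": issue random motor commands and record what the face does.
    commands = rng.uniform(-1.0, 1.0, size=(500, N_MOTORS))
    outcomes = np.array([observe_expression(c) for c in commands])

    # 2. Fit an inverse model (expression features -> motor commands), so a
    #    desired expression can be translated back into servo positions.
    inverse = Ridge(alpha=1.0).fit(outcomes, commands)

    # 3. Reproduce a target expression never seen during babbling.
    target = rng.normal(size=N_FEATURES)
    motors = inverse.predict(target[None, :])[0]
    achieved = observe_expression(motors)
    print("expression error:", np.linalg.norm(achieved - target))

In the real robot, the observation step would be a camera plus facial-expression recognition rather than a simulated face, but the babble-then-invert structure is the same: once the mapping is learned, any reachable target expression can be translated into motor commands, including ones never tried during practice.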
Such an expressive robot could be useful as an assistant or teacher, or simply as a means of learning more about how humans develop expressions. But a robot that watches itself in a mirror, practicing and improving how it looks, seems like another step into the uncanny valley.