MIT's Nexi robot expresses emotions the same way you do - with a highly mobile face.
Nexi's head and face were designed by Xitome Design with MIT. The expressive robotics starts with a neck mechanism sporting four degrees of freedom (DoF) at the base, plus pan-tilt-yaw of the head itself. The mechanism is timed so that its movements mimic human head speed.
Nexi's face uses gaze, eyebrows, eyelids and an articulated mandible to communicate a wide range of emotions.
Nexi has a color CCD in each eye as well as (top this, humans) an indoor Active 3D infrared camera in its head and four microphones to support sound localization.
The chassis for the robot is also advanced; it is based on the uBot5 mobile manipulator developed by the Laboratory for Perceptual Robotics at UMass Amherst. The mobile base can balance dynamically on two wheels, giving Nexi what amounts to a Segway-like body. The arms can lift ten pounds, and the plastic covering of the chassis can detect human touch.
Now, Nexi needs to learn to react emotionally, like the Kansei robot created at Meiji University's School of Science and Technology. You might also compare Nexi's range of emotional expression with that of the South Korean EveR2-Muse robot, which has a more human-like face. The WD-2 Face Morphing robot, meanwhile, uses a much more flexible facial structure.
Take a look at this video of Nexi expressing itself. Via MIT's Nexi Robot; see also the MIT Nexi Mobile Dexterous Social robot website.
(This Science Fiction in the News story used with permission of Technovelgy.com - where science meets fiction)