FACS and Humanoid Robot Design: An Overview

14 Feb 2020

One of the most significant aspects of humanoid robot design is its emotional impact, which is achieved through intelligent control of the robot’s facial expressions. Roboticists are constantly working to bridge the gap between the appearance and functionality of humans and robots.

FACS, or the Facial Action Coding System, is a useful tool in this regard, as it guides researchers in developing robots that can simulate human expressions.

Early Runners in the Field

As technology rapidly evolves, demand for humanoid robots is outpacing demand for traditional robots.

Some famous humanoid robots include Kismet, developed at MIT in the USA; SAYA and the WE series, developed in Japan; and ROMAN, developed in Germany.

There are two main reasons behind the rising popularity of such “emotional” humanoid robot designs. Part of the audience finds them fascinating machines and enjoys watching their reactions, such as anger or a smile. Another part prefers to play with toys that exhibit such human gestures and facial expressions.

Facial Action Coding System (FACS)

The Facial Action Coding System (FACS) was first developed by Ekman and Friesen in 1978 to measure facial activity in behavioral research on the face.

According to the research behind FACS, humans, regardless of culture, ethnicity, or country, share seven distinct emotional facial expressions: happiness, sadness, surprise, fear, anger, contempt, and disgust. That is why roboticists take inspiration from FACS.

Guided by FACS, researchers divide the humanoid robot head into six essential parts – the neck, mouth, mandible, eyebrows, eyelids, and eyeballs.
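To make this mapping concrete, here is a minimal sketch (in Python) of how the seven basic expressions could be stored as normalized pose targets for those six head regions. The layout, names, and numbers are illustrative assumptions, not values from any particular robot.

```python
# Hypothetical sketch: the seven basic expressions as normalized pose
# targets (0.0-1.0) for the six head regions named above. Real robots map
# FACS action units to specific actuators; these numbers are illustrative.

BASIC_EXPRESSIONS = {
    "happiness": dict(neck=0.0, mouth=0.8, mandible=0.2, eyebrow=0.3, eyelid=0.4, eyeball=0.5),
    "sadness":   dict(neck=0.2, mouth=0.1, mandible=0.0, eyebrow=0.7, eyelid=0.6, eyeball=0.3),
    "surprise":  dict(neck=0.1, mouth=0.6, mandible=0.9, eyebrow=1.0, eyelid=0.9, eyeball=0.5),
    "fear":      dict(neck=0.3, mouth=0.4, mandible=0.5, eyebrow=0.9, eyelid=0.8, eyeball=0.5),
    "anger":     dict(neck=0.0, mouth=0.2, mandible=0.3, eyebrow=0.0, eyelid=0.3, eyeball=0.5),
    "contempt":  dict(neck=0.0, mouth=0.3, mandible=0.1, eyebrow=0.2, eyelid=0.4, eyeball=0.6),
    "disgust":   dict(neck=0.0, mouth=0.3, mandible=0.2, eyebrow=0.1, eyelid=0.5, eyeball=0.5),
}

def expression_targets(emotion: str) -> dict:
    """Return the per-region pose targets for one of the seven basic expressions."""
    return BASIC_EXPRESSIONS[emotion]
```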

Design of Robot Head

The design of a robot head can be divided into two parts: the intelligent control system, which is the software side, and the motor control system, which is the mechanical side.

To put it simply, the intelligent control system is the soul of the robot, as it enables the robot to generate humanoid emotion. The mechanical structure, on the other hand, is the emotion carrier that translates emotional information into action commands.
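As a rough illustration of this split, the sketch below models the intelligent control system as a class that decides which emotion to show, and the motor control system as a class that turns that decision into actuator commands. Every class, method, and rule here is a simplified assumption, not an actual robot API.

```python
# Hypothetical two-layer split: a software layer that decides the emotion,
# and a mechanical layer that carries it out on the head actuators.

class IntelligentControl:
    """Software layer: estimates which emotion the robot should display."""

    def decide_emotion(self, speech: str, observed_face: str, body_motion: str) -> str:
        # Placeholder logic: a real system would fuse language, facial
        # expression, and body-movement cues into an internal emotion model.
        if "wow" in speech.lower() or observed_face == "surprise" or "startled" in body_motion:
            return "surprise"
        return "happiness"  # default when no stronger cue is detected


class MotorControl:
    """Mechanical layer: converts an emotion label into actuator commands."""

    def __init__(self, targets_by_emotion: dict):
        self.targets_by_emotion = targets_by_emotion  # e.g. the table sketched earlier

    def actuate(self, emotion: str) -> None:
        for region, position in self.targets_by_emotion[emotion].items():
            # In a real head this would send a position command to the actuator
            # driving that region (neck, mouth, mandible, eyebrow, eyelid, eyeball).
            print(f"set {region} actuator to {position:.2f}")


# Usage: the intelligent layer decides, the mechanical layer carries it out.
# brain = IntelligentControl()
# head = MotorControl(BASIC_EXPRESSIONS)
# head.actuate(brain.decide_emotion("wow, really?", "surprise", "startled step back"))
```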

Let’s briefly discuss these in the following passages:

  • Intelligent Control System

If a robot has to show its “feelings”, it is not enough to analyze only the external environment and the robot’s own emotional condition. A proper understanding of human language, facial expressions, and body movements is also required when modeling the robot’s emotional state.

  • Mechanical Structure

A robot with artificial emotion has to carry out a range of body language gestures and facial expressions. The mechanical structure is crucial to making these expressions appealing and human-like.

A single expression is formed by a synthesis of multiple features. For instance, when someone is surprised, the eyebrows lift and the forehead skin stretches, the eyes open wide as the upper eyelids are pulled up, and the mouth opens as the jaw drops. Synchronizing multiple features in this way is what delivers a convincing humanoid robot design.
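The sketch below illustrates that kind of synchronization for the surprise example: each contributing feature ramps toward its target at the same time, rather than one after another. The regions, target values, and timing are illustrative assumptions.

```python
# Hypothetical synthesis of the "surprise" expression from several
# synchronized feature movements. Targets and timing are illustrative.

import time

SURPRISE_COMPONENTS = [
    # (head region, target position 0.0-1.0)
    ("eyebrow",  1.0),   # brows lift, stretching the forehead skin
    ("eyelid",   0.9),   # upper eyelids pull up so the eyes open wide
    ("mandible", 0.8),   # jaw drops, opening the mouth
]

def play_surprise(duration_s: float = 0.3, steps: int = 10) -> None:
    """Ramp every component toward its target simultaneously, so the
    features move together rather than sequentially."""
    for step in range(1, steps + 1):
        fraction = step / steps
        for region, target in SURPRISE_COMPONENTS:
            position = target * fraction
            # A real controller would command the actuator here; we just log it.
            print(f"t={fraction * duration_s:.2f}s  {region} -> {position:.2f}")
        time.sleep(duration_s / steps)
```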
Conclusion

Day by day, humanoid robots are becoming more lifelike. Are we about to witness a time when we cannot distinguish between a robot and a human? Time will tell.

For more such fascinating information and custom robotic solutions, contact Custom Entertainment Solutions. Call 01.801.410.4869 today.
