Developing Animatronic Hand with Gesture-based Interaction Capabilities

13 Apr 2020

Animatronics uses electro-mechanical devices to build systems that replicate human body movements. With the growing demand in the entertainment industry for animatronic hands that operate with high precision and accuracy, roboticists are developing robotic hand controller units that let a human and an animatronic hand interact through sign gestures.

When developing the software and hardware that allow a computer to interpret human hand gestures and American Sign Language (ASL), it is very handy to have a hand like the DYN HAND GEN 3, which offers 13 degrees of freedom. Because this hand can repeat gestures reliably, engineers can iterate quickly during the design and testing phases of the human-machine software and hardware.

This idea led to the development of an animatronic hand that looks identical to a human hand and, by means of an image processing approach, can even outperform it.

Overview

The goal of human gesture recognition and modelling is to translate gestures into a message. This has inspired roboticists to develop an animatronic hand that can serve as a substitute for a human hand in environments where a person cannot be present.

Recognition and modelling of gestures, however, pose several challenges because of variations in hand shape and size. Human gestures are usually categorized as stationary or non-stationary. A stationary gesture is a fixed posture and position, represented by a single image, while a non-stationary gesture is represented by a series of images.

To develop such an animatronic hand, engineers first need to build a human-robot interaction controller that recognizes a set of human hand gestures and maps each one to the corresponding action of the robotic hand.

To accomplish this, sign language, computer graphics, and automated robots are combined. One of the latest methods of gesture recognition and modelling uses flex sensors and wireless communication. The gesture model is built around three distinct criteria: hand orientation, hand position, and hand movement. The flex sensors measure how far each finger bends and transmit those readings to the robotic controller unit, which converts them into finger motion (a simple mapping is sketched below).
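The article describes the sensor-to-controller mapping only at a high level; the Python sketch below shows, under assumed calibration values, how a controller might convert a raw flex-sensor reading into a finger-servo angle. The ADC range, servo limits, and function name are illustrative assumptions, not details from the original design.

```python
# Minimal sketch: mapping a flex-sensor reading to a finger-servo angle.
# The ADC range and servo limits below are illustrative assumptions; a real
# controller would calibrate them per finger.

FLEX_ADC_MIN = 200     # assumed reading with the finger fully straight
FLEX_ADC_MAX = 800     # assumed reading with the finger fully bent
SERVO_MIN_DEG = 0      # assumed servo angle for a straight finger
SERVO_MAX_DEG = 90     # assumed servo angle for a fully curled finger

def flex_to_servo_angle(adc_value: int) -> float:
    """Linearly map a raw flex-sensor ADC value to a servo angle in degrees."""
    adc_value = max(FLEX_ADC_MIN, min(FLEX_ADC_MAX, adc_value))  # clamp to range
    fraction = (adc_value - FLEX_ADC_MIN) / (FLEX_ADC_MAX - FLEX_ADC_MIN)
    return SERVO_MIN_DEG + fraction * (SERVO_MAX_DEG - SERVO_MIN_DEG)

if __name__ == "__main__":
    for reading in (150, 200, 500, 800, 900):
        print(reading, "->", round(flex_to_servo_angle(reading), 1), "deg")
```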

  • Design Methodology

The design methodology for an animatronic hand developed using image processing includes multiple stages. Some of these are as follows:

  • Image Processing

Input gestures are captured with a webcam, and the images are acquired in MATLAB using its image acquisition tools.
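The article's pipeline acquires images through MATLAB; purely as an illustration, the sketch below grabs a single webcam frame with Python and OpenCV instead. The camera index and output filename are assumptions made for this sketch.

```python
# Minimal capture sketch using OpenCV rather than the MATLAB toolchain
# described in the article; camera index 0 is an assumption.
import cv2

cap = cv2.VideoCapture(0)          # open the default webcam
if not cap.isOpened():
    raise RuntimeError("Could not open the webcam")

ret, frame = cap.read()            # grab a single gesture frame
if ret:
    cv2.imwrite("gesture_frame.png", frame)   # save it for the later stages
cap.release()
```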

  • Image Acquisition

During this stage, the video input is acquired and processed. The captured gesture frames are converted into a suitable configuration and then encoded for the later stages.
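A minimal sketch of one possible "suitable configuration", assuming the grayscale conversion that gesture pipelines commonly use; the filenames simply continue from the capture sketch above and are not part of the original article.

```python
# Illustrative acquisition step: load the captured frame and convert it to
# grayscale, a common working format for gesture images.
import cv2

frame = cv2.imread("gesture_frame.png")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)   # single-channel image for later stages
cv2.imwrite("gesture_gray.png", gray)
```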

  • Pre-Processing

This stage adjusts image parameters to suppress unwanted distortion and to enhance the image characteristics needed by the later processing stages.
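The article does not name a specific enhancement, so the sketch below assumes histogram equalization, one common way to strengthen contrast before the later stages.

```python
# Illustrative pre-processing step (an assumption, not stated in the article):
# histogram equalization spreads pixel intensities across the full range.
import cv2

gray = cv2.imread("gesture_gray.png", cv2.IMREAD_GRAYSCALE)
equalized = cv2.equalizeHist(gray)
cv2.imwrite("gesture_equalized.png", equalized)
```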

  • Noise Reduction

This step removes unwanted background noise from the image.
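As an illustration, the sketch below uses a median filter, a common choice for suppressing background speckle while keeping edges reasonably sharp; the kernel size is an assumption.

```python
# Illustrative noise-reduction step: a 5x5 median filter (assumed kernel size).
import cv2

img = cv2.imread("gesture_equalized.png", cv2.IMREAD_GRAYSCALE)
denoised = cv2.medianBlur(img, 5)
cv2.imwrite("gesture_denoised.png", denoised)
```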

  • Resizing

During this step, the acquired image is resized according to the required image size parameters.
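A minimal sketch of the resizing call; the 128x128 target size is an assumption chosen only to show the step, not a requirement from the article.

```python
# Illustrative resizing step with an assumed 128x128 target size.
import cv2

img = cv2.imread("gesture_denoised.png", cv2.IMREAD_GRAYSCALE)
resized = cv2.resize(img, (128, 128), interpolation=cv2.INTER_AREA)
cv2.imwrite("gesture_resized.png", resized)
```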

  • Pruning

In this morphological process, unwanted objects are removed from the image.
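The article does not specify the morphological operation, so the sketch below assumes a binarization followed by a morphological opening, which strips away small unwanted objects; the threshold method and kernel size are assumptions.

```python
# Illustrative pruning step: binarize the image, then apply a morphological
# opening to remove small unwanted objects.
import cv2
import numpy as np

img = cv2.imread("gesture_resized.png", cv2.IMREAD_GRAYSCALE)
_, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
kernel = np.ones((3, 3), np.uint8)
pruned = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)  # remove specks smaller than the kernel
cv2.imwrite("gesture_pruned.png", pruned)
```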

Conclusion

As you can see, several stages are involved in converting human hand gestures into the movements of an animatronic hand that simulates a real human hand. Beyond that, such a robotic hand can perform some tasks even better than a human hand.

To learn more about an animatronic hand or to purchase one, you may contact the professionals at Custom Entertainment Solutions. Call 01.801.410.4869 today!
