
Sign Talk: A Gesture Language Translator

Ashwini Kumar Sinha


Communicating with a specially challenged person who can’t speak or hear is quite difficult, especially when you don’t know sign language.

So, to ease this problem, today you will learn to make a Gesture Language Translator device that converts sign language into spoken language. The device is based on an ML model that can recognize the different sign language gestures for accurate translation.

Fig 1. Gesture language translator prototype

Bill Of Materials

Preparing The ML Model 

Different sign languages are practised in different countries. For example, India uses Indian Sign Language (ISL) while the USA uses American Sign Language (ASL). So, you first need to decide which one you wish to implement.

To obtain ASL datasets, visit here; for ISL datasets, visit here. You can also search here for the numerous hand gestures used in daily life.

After gathering the datasets, prepare the ML model for training. There are many options for this, such as TensorFlow, Keras, PyTorch, and Google’s Teachable Machine. For this project, I am using Google’s Teachable Machine, an online ML model creation service. Now feed the datasets into your chosen ML model creator, and capture pictures of the different hand gestures/signs with a camera. Remember to label each class according to its meaning.

Fig 2. Feeding the datasets into the ML model

Deploying ML Model 

After training, you can either download the ML model or upload it to the cloud. Uploading gives you a shareable link to the model along with a ready-made code snippet for using it on various platforms, such as Python (Keras), p5.js, or plain JavaScript (JS). After uploading the ML model, copy the JavaScript code snippet.
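For reference, the JavaScript export from Teachable Machine typically looks like the sketch below. The model URL is a placeholder (yours is generated when you upload the model) and the library versions are only examples; a quick load-and-check like this confirms the model and its gesture classes are reachable before wiring up the camera:

<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs@1.3.1/dist/tf.min.js"></script>
<script src="https://cdn.jsdelivr.net/npm/@teachablemachine/image@0.8/dist/teachablemachine-image.min.js"></script>
<script>
  // Placeholder: replace with the model link generated by Teachable Machine
  const URL = "https://teachablemachine.withgoogle.com/models/XXXXXXX/";

  async function loadModel() {
    // model.json and metadata.json are hosted alongside the uploaded model
    const model = await tmImage.load(URL + "model.json", URL + "metadata.json");
    console.log("Number of gesture classes:", model.getTotalClasses());
    return model;
  }
</script>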

Coding

Open the Raspberry Pi desktop and create a new file named Gesturetranslator.html. Paste the copied code snippet into it and add the project name as the page title. Increase the canvas size so that the camera feed is clearly visible in a larger window, then save the file.

Fig 3. Coding
Fig 4. Code snippet
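Putting the pieces together, a minimal Gesturetranslator.html could look like the following sketch. It assumes a Teachable Machine image model (the URL is a placeholder), uses a 400x400 webcam canvas so the feed is clearly visible, and simply prints the most likely gesture label; your exported snippet and sizes may differ:

<html>
<body>
  <h1>Sign Talk: Gesture Language Translator</h1>
  <button type="button" onclick="init()">Start</button>
  <div id="webcam-container"></div>
  <div id="label-container"></div>

  <script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs@1.3.1/dist/tf.min.js"></script>
  <script src="https://cdn.jsdelivr.net/npm/@teachablemachine/image@0.8/dist/teachablemachine-image.min.js"></script>
  <script>
    // Placeholder model link from Teachable Machine
    const URL = "https://teachablemachine.withgoogle.com/models/XXXXXXX/";
    let model, webcam, labelContainer;

    async function init() {
      model = await tmImage.load(URL + "model.json", URL + "metadata.json");

      // Larger canvas (400x400) so the camera feed is clearly visible
      webcam = new tmImage.Webcam(400, 400, true); // width, height, flip
      await webcam.setup();   // asks for camera permission
      await webcam.play();
      window.requestAnimationFrame(loop);

      document.getElementById("webcam-container").appendChild(webcam.canvas);
      labelContainer = document.getElementById("label-container");
    }

    async function loop() {
      webcam.update();        // grab the latest camera frame
      await predict();
      window.requestAnimationFrame(loop);
    }

    async function predict() {
      const prediction = await model.predict(webcam.canvas);
      // Show the most likely gesture label, i.e. the translated sign
      const best = prediction.reduce((a, b) => (a.probability > b.probability ? a : b));
      labelContainer.innerHTML = best.className + " (" + best.probability.toFixed(2) + ")";
    }
  </script>
</body>
</html>

If you also want the translated sign to be spoken aloud, the top label can be passed to the browser’s SpeechSynthesis API.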

Testing

Enable the camera interface on the Raspberry Pi and connect the camera to the board. Now open Gesturetranslator.html in Google Chrome or any other browser that supports it.

Fig 5.
Fig 6. Testing sign language translator device

Download ML Model Datasets and Code
