Hand gesture recognition in sign language

We used hand detection to develop a PoC for sign language recognition, helping our client raise $100,000 in funding.

CLIENT
EldeS

EldeS is an Uruguayan project that aims to support and popularize sign language, both in Uruguay and worldwide. Their ultimate goal is to achieve greater inclusion of people with hearing impairment in their environment.

To achieve this, they believe it is vital that we all speak their language. For this reason, they integrate technology into teaching, making learning sign language much easier and more fun.

By using virtual technology in an innovative way, any student can interact with the videos presented in the online courses through a webcam and motion detection software.

Challenge

What was needed

EldeS wanted to create an interactive platform where students can progress on their own, using a system that detects and analyzes their movements through a webcam and gives them instant feedback.

The AI solution needed to be able to:
Correctly identify the different sign language exercises.
Identify the different signs and gestures made with the hands.

STAGE 1

Data acquisition

For this stage, we worked hand in hand with the client. They didn’t have any data, so we needed to create our own dataset. We developed an easy way to record and tag the different signs and, with the help of EldeS’s tutors and users, managed to generate enough videos for us to train a model.
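
As an illustration, a minimal sketch of such a recording-and-tagging tool is shown below; the sign label argument, folder layout, and key bindings are assumptions for the sketch, not EldeS's actual tooling.

```python
# Illustrative sketch of a webcam recording/tagging tool for building the dataset.
# The label argument, output layout, and key bindings are assumptions.
import argparse
import os
import time

import cv2


def record_sign(label: str, out_dir: str = "dataset") -> None:
    """Record a webcam clip and save it under a folder named after the sign."""
    os.makedirs(os.path.join(out_dir, label), exist_ok=True)
    cap = cv2.VideoCapture(0)
    width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
    path = os.path.join(out_dir, label, f"{label}_{int(time.time())}.mp4")
    writer = cv2.VideoWriter(path, cv2.VideoWriter_fourcc(*"mp4v"), 30.0, (width, height))

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        writer.write(frame)
        cv2.imshow(f"Recording '{label}' - press q to stop", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    writer.release()
    cv2.destroyAllWindows()


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("label", help="Name of the sign being recorded")
    args = parser.parse_args()
    record_sign(args.label)
```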

STAGE 2

Detection code

We developed a distance-based detection method using MediaPipe's hand detection model and coded our own logic to recognize gestures from the detected hand keypoints.

For each sign we defined a ‘Golden Rule’: the perfect version of that sign, meaning the movement, gesture, or position made exactly as it should be. If the user’s sign is close enough to the Golden Rule, it is considered correct.
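
As an illustration, the sketch below shows one way to extract hand keypoints with MediaPipe Hands and score a frame against a stored Golden Rule pose. The normalization scheme, distance metric, threshold, and file names are assumptions for the sketch, not the exact logic that was shipped.

```python
# Illustrative sketch: extract hand keypoints with MediaPipe Hands and score
# a frame against a stored "Golden Rule" pose. The normalization, metric, and
# threshold are assumptions, not the production logic.
import cv2
import mediapipe as mp
import numpy as np

mp_hands = mp.solutions.hands


def extract_keypoints(frame_bgr, hands) -> np.ndarray | None:
    """Return a (21, 2) array of normalized hand landmarks, or None if no hand is found."""
    results = hands.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    if not results.multi_hand_landmarks:
        return None
    landmarks = results.multi_hand_landmarks[0].landmark
    points = np.array([[lm.x, lm.y] for lm in landmarks])
    # Translate to the wrist and scale by hand size so the comparison is roughly
    # invariant to where the hand is and how close it is to the camera.
    points -= points[0]
    scale = np.linalg.norm(points, axis=1).max()
    return points / scale if scale > 0 else points


def matches_golden_rule(points: np.ndarray, golden: np.ndarray, threshold: float = 0.15) -> bool:
    """A sign counts as correct if its mean keypoint distance to the Golden Rule is small."""
    return float(np.mean(np.linalg.norm(points - golden, axis=1))) < threshold


if __name__ == "__main__":
    golden = np.load("golden_rules/hello.npy")  # hypothetical stored reference pose
    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    cap.release()
    with mp_hands.Hands(static_image_mode=True, max_num_hands=1) as hands:
        points = extract_keypoints(frame, hands)
    if points is not None:
        print("correct" if matches_golden_rule(points, golden) else "try again")
```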

STAGE 3

STAGE 4

Deploy

We used a hybrid solution in which part of the processing is done on the edge and the rest in the cloud. The computationally heavy Computer Vision module runs in the browser, and the resulting hand position information is sent to a cloud server, which analyzes the movement and detects which gesture is being made.
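
As a rough sketch of what the cloud side of such a split could look like, assuming a Flask endpoint that receives a JSON payload of normalized keypoints and applies the same distance scoring as above (the endpoint name, payload shape, and threshold are illustrative assumptions):

```python
# Minimal sketch of the cloud side of a hybrid setup: the browser runs the
# heavy hand-detection model and POSTs keypoints here for gesture analysis.
# The endpoint name, payload shape, and scoring threshold are assumptions.
import numpy as np
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical store of reference poses, one (21, 2) array per sign.
GOLDEN_RULES = {"hello": np.load("golden_rules/hello.npy")}


@app.post("/analyze")
def analyze():
    payload = request.get_json()
    sign = payload["sign"]                   # which exercise the student is doing
    points = np.array(payload["keypoints"])  # (21, 2) normalized hand landmarks
    golden = GOLDEN_RULES[sign]
    distance = float(np.mean(np.linalg.norm(points - golden, axis=1)))
    return jsonify({"sign": sign, "distance": distance, "correct": distance < 0.15})


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```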

Impact

The implemented solution achieved 90% precision, and this first PoC helped EldeS secure over $100,000 in investment and start the development of the final product, a path we are currently taking together.

90% precision reached

800 videos from 80 different signs

“Beyond the fact that we reached a very good result, we value how organized, how transparent and how serious they are.”

Martín Curzio, Co-Founder of EldeS