We used hand detection to develop a PoC for Sign Language detection, helping our client raise $100,000 in funding.
ELdeS is an Uruguayan project that aims to support and popularize sign language both in their country and worldwide. Their ultimate goal is to achieve greater inclusion of people with hearing impairment in their environment.
To achieve this, they believe it is vital that everyone speaks their language. For this reason, they integrate technology into their teaching, making learning sign language much easier and more fun.
By using this technology in an innovative way, any student can interact with the videos presented in the online courses, through a webcam and motion-detection software.
ELdeS wanted to create an interactive platform where technology lets students progress on their own: a system that detects and analyzes the user's movements through a webcam and gives them instant feedback.
For this stage, we worked hand in hand with the client. They didn't have any data, so we needed to create our own dataset. We developed an easy way to record and tag the different signs and, with the help of ELdeS's tutors and users, managed to generate enough videos for us to train a model.
We developed a gesture recognition method based on a MediaPipe hand detection model, coding our own logic to recognize gestures from the distances between detected hand keypoints.
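To illustrate the idea, here is a minimal sketch of distance-based gesture logic on MediaPipe hand keypoints. The landmark indices are MediaPipe's real ones; the `is_open_palm` rule itself is a hypothetical example, not ELdeS's actual logic, and it assumes the 21 landmarks have already been extracted as normalized (x, y) pairs.

```python
import math

# MediaPipe Hands landmark indices (from the library's hand model)
WRIST = 0
FINGER_TIPS = [4, 8, 12, 16, 20]   # thumb..pinky fingertips
FINGER_MIDS = [2, 6, 10, 14, 18]   # corresponding middle joints (thumb MCP)

def dist(a, b):
    """Euclidean distance between two (x, y) keypoints."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def is_open_palm(landmarks):
    """Rough open-palm check: each fingertip is farther from the wrist
    than its middle joint. `landmarks` is a list of 21 (x, y) tuples in
    normalized image coordinates, as MediaPipe Hands produces."""
    wrist = landmarks[WRIST]
    return all(
        dist(landmarks[tip], wrist) > dist(landmarks[mid], wrist)
        for tip, mid in zip(FINGER_TIPS, FINGER_MIDS)
    )
```

Rules like this compare relative keypoint distances rather than absolute pixel positions, so they are robust to the user's distance from the webcam.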
We defined 'Golden Rules': the perfect version of each sign, meaning the movement, gesture, or position performed exactly as it should be. If the user's sign is close enough to the Golden Rule, it is counted as correct.
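One simple way to compare a user's attempt against a Golden Rule is the mean distance between corresponding keypoints, accepting the sign when it falls below a threshold. This is a minimal sketch under that assumption; the actual comparison logic and threshold used in the project are not described here.

```python
import math

def distance_to_golden(sample, golden):
    """Mean Euclidean distance between corresponding keypoints of the
    user's sign (`sample`) and the reference sign (`golden`), both given
    as lists of normalized (x, y) pairs."""
    assert len(sample) == len(golden)
    total = sum(
        math.hypot(sx - gx, sy - gy)
        for (sx, sy), (gx, gy) in zip(sample, golden)
    )
    return total / len(sample)

def is_correct(sample, golden, threshold=0.05):
    """Accept the sign when it is close enough to the Golden Rule.
    The threshold value here is illustrative, not the project's."""
    return distance_to_golden(sample, golden) <= threshold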
We used a hybrid solution in which part of the processing is done on the edge and the rest in the cloud. The heavyweight computer vision module runs in the browser, and the extracted hand position information is sent to a cloud server, which analyzes the movement and detects which gesture is being made.
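A key benefit of this split is bandwidth: the browser sends only keypoint coordinates, not video frames. The sketch below shows a hypothetical payload for one frame of hand keypoints; the schema and field names are illustrative assumptions, as the real wire format is not public.

```python
import json

def keypoints_payload(frame_id, landmarks):
    """Serialize one frame of hand keypoints for the cloud gesture
    service (hypothetical schema). 21 rounded (x, y) pairs come to a
    few hundred bytes per frame, versus tens of kilobytes for a raw
    video frame, so the webcam video never leaves the browser."""
    return json.dumps({
        "frame": frame_id,
        "points": [[round(x, 4), round(y, 4)] for x, y in landmarks],
    })
```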
The implemented solution achieved 90% precision, and this first PoC helped ELdeS access over $100,000 in investment and start the development of the final product, a path we are currently taking together.
“Beyond the fact that we reached a very good result, we value how organized, how transparent and how serious they are.”