Technology has revolutionized the way we communicate, breaking down language barriers and enabling greater connection, understanding, inclusion, and accessibility.
Live audio transcription and translation tools have limitations for the deaf and hard-of-hearing due to sign language's complex combination of fast-paced hand gestures, facial expressions, and full body movements. While machine learning models can handle facial expressions and body movements, detecting hand and finger movements remains a challenge.
Current technology can detect simple hand gestures, but understanding the full spectrum of sign language requires better hand and finger detection.
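To make that gap concrete, here is a minimal sketch of what off-the-shelf hand detection looks like today, using MediaPipe Hands as an illustrative library choice (the article does not name a specific tool). It returns 21 landmarks per hand from a single image, which works for simple gestures but falls short on the fast, overlapping finger movements of fluent signing.

```python
# Minimal hand-landmark detection with MediaPipe Hands (illustrative example).
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

# static_image_mode=True treats each frame as an independent photo.
with mp_hands.Hands(static_image_mode=True,
                    max_num_hands=2,
                    min_detection_confidence=0.5) as hands:
    image = cv2.imread("hand.jpg")  # hypothetical input image
    # MediaPipe expects RGB input; OpenCV loads images as BGR.
    results = hands.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))

    if results.multi_hand_landmarks:
        for hand_landmarks in results.multi_hand_landmarks:
            # 21 landmarks per hand: normalized x, y plus a relative depth z.
            for i, lm in enumerate(hand_landmarks.landmark):
                print(f"landmark {i}: x={lm.x:.3f} y={lm.y:.3f} z={lm.z:.3f}")
```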
How do we make sure that the deaf and hard of hearing get equal access to language tools? And how can we help accelerate the development of better hand and finger detection, so that, in the end, we can create these tools?
Launched to celebrate American Sign Language Day (April 15th), GiveAHand.ai is using tech for good. One hundred percent crowdsourced, the data collected on the platform will generate a diverse dataset of hands: varied shapes, colors, backgrounds, and gestures.
Anyone can put their hands to good use by uploading images, helping to build an image library that will unlock sign language. Researchers can then download these fully tagged images and use them to improve machine learning models capable of detecting and translating the full spectrum of sign language.
GiveAHand.ai was developed alongside the American Society for Deaf Children. Its aim is to create an extensive dataset of hands, fully tagged with 3D keypoints.
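To illustrate what "fully tagged with 3D keypoints" might mean in practice, here is a hypothetical record for one donated image. The actual GiveAHand.ai schema is not published in this article, so all field names below are illustrative assumptions.

```python
# Hypothetical record for one donated hand image (illustrative schema only;
# the real GiveAHand.ai data format is not described in this article).
hand_record = {
    "image": "hand_000123.jpg",        # the uploaded photo
    "handedness": "left",              # left or right hand
    "gesture": "open_palm",            # tagged gesture label
    "keypoints_3d": [                  # one (x, y, z) triple per joint
        {"joint": "wrist",     "x": 0.41, "y": 0.77, "z": 0.02},
        {"joint": "thumb_tip", "x": 0.55, "y": 0.60, "z": -0.01},
        # ... remaining joints of the hand
    ],
}
```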
The website is designed to be incredibly simple for both those lending a hand and those who want access to the dataset. Adding your hand is as easy as taking a picture with your webcam and uploading it instantly; it requires almost no explanation and takes less than a minute to complete.
Once uploaded, the hands are tagged and added to the image library, giving researchers and developers access to a diverse collection of hands for improving machine learning models.
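On the researcher side, consuming such records could be as simple as the sketch below, which assumes the hypothetical JSON layout shown earlier and converts one tagged hand into a training-ready array.

```python
# Sketch of loading one tagged hand record into a NumPy array, assuming the
# hypothetical schema above (not a published GiveAHand.ai API).
import json
import numpy as np

def load_keypoints(path: str) -> np.ndarray:
    """Return an (N, 3) array of (x, y, z) keypoints for one tagged hand."""
    with open(path) as f:
        record = json.load(f)
    return np.array([[kp["x"], kp["y"], kp["z"]]
                     for kp in record["keypoints_3d"]], dtype=np.float32)

# keypoints = load_keypoints("hand_000123.json")  # hypothetical file name
```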
“Donating” a hand to our dataset is one of the easiest ways to help make sign language accessible to all, and it can be done from anywhere, by anyone, which makes for a fully inclusive and varied dataset.
The American Society for Deaf Children aims to empower diverse families with deaf/hard-of-hearing children and youth by embracing full access to language-rich environments through mentoring, advocacy, resources, and collaborative networks. With GiveAHand.ai we want to build on this commitment to give these families equal access to language tools.
We launched on American Sign Language Day, April 15th, and despite having no media budget, conversions have far exceeded our projections. On average, each visitor contributed a hand: one visitor equals one hand. Within the first five days, we are already a third of the way towards becoming the world’s largest fully manually tagged, open dataset for finger and hand detection.
AI researchers are already looking for ways to use this data to improve their current machine learning models.
As the dataset grows, so does the opportunity to build better hand models and to create a sustainable solution that overcomes the sign language barrier for good.