Student in India develops AI model that turns sign language to English

According to Priyanjali, her newly developed AI-powered model was inspired by data scientist Nicholas Renotte’s video on real-time sign language detection. She built the model using the TensorFlow Object Detection API, translating hand gestures via transfer learning from a pre-trained model named ssd_mobilenet.
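To make the transfer-learning idea concrete, here is a minimal sketch, not Priyanjali's actual code: a pre-trained backbone is frozen and only a small new classification head is trained on the gesture data. The class count, input size, and use of MobileNetV2 (a stand-in for the ssd_mobilenet backbone) are illustrative assumptions; `weights=None` is used so the sketch runs offline, whereas a real setup would load pre-trained weights.

```python
# Illustrative sketch only, assuming a generic Keras transfer-learning setup.
import numpy as np
import tensorflow as tf

NUM_CLASSES = 6  # hypothetical number of gestures, e.g. "hello", "thanks", ...

# Backbone: MobileNetV2 feature extractor (stand-in for ssd_mobilenet's backbone).
# weights=None keeps the sketch offline; in practice you would load pre-trained weights.
backbone = tf.keras.applications.MobileNetV2(
    input_shape=(96, 96, 3), include_top=False, weights=None
)
backbone.trainable = False  # freeze the backbone: this is the "transfer" step

# New classification head, the only part trained on the sign-language data.
model = tf.keras.Sequential([
    backbone,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Dummy batch of two "webcam frames" just to show the shapes involved.
frames = np.random.rand(2, 96, 96, 3).astype("float32")
probs = model.predict(frames, verbose=0)
print(probs.shape)  # one probability distribution over gestures per frame
```

After fine-tuning the head, each webcam frame yields a probability distribution over the gesture classes, and the highest-probability class is shown as the translated word.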

She also mentioned that building a deep learning model dedicated to sign language detection is quite challenging, and said she believes the open-source community will find a solution soon, making it possible to build deep learning models solely for sign languages in the future.

Earlier, in 2016, two students from the University of Washington, Thomas Pryor and Navid Azodi, invented a pair of gloves called ‘SignAloud’, which could translate sign language into speech or text.

They won the Lemelson-MIT Student Prize for their SignAloud entry.


Original post: https://www.nationthailand.com/international/40012554
