Microsoft Surface inspired me to use technology to empower people, so I wanted to build something that helps individuals empower themselves. Roughly 1 in every 1,250 people is mute. How do they communicate? Sign language is difficult for most of us to learn. So what can we do?

I used the Microsoft Kinect to recognize the gestures used in sign language, training it to detect the eligible gestures. This way, anyone can understand the signs a mute person is making by following the Kinect's output.
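The write-up does not show the recognition code, but the idea can be sketched as matching a new frame of skeletal joint coordinates against labelled gesture templates. The joint layout, labels, and nearest-neighbour approach below are illustrative assumptions, not the project's actual implementation:

```python
import math

# Hypothetical sketch: each gesture sample is a flat list of (x, y, z)
# coordinates for the skeletal joints the Kinect tracks. A nearest-neighbour
# lookup against labelled templates is one simple way to map a frame to a sign.

def distance(a, b):
    """Euclidean distance between two flattened joint-coordinate vectors."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def classify(sample, templates):
    """Return the label of the template whose joints are closest to the sample.

    templates: dict mapping sign label -> flattened joint vector.
    """
    return min(templates, key=lambda label: distance(sample, templates[label]))

# Toy templates with two joints (x, y, z each) per gesture -- illustration only.
templates = {
    "hello": [0.0, 1.0, 0.5, 0.2, 1.1, 0.5],
    "thanks": [0.5, 0.2, 0.4, 0.6, 0.3, 0.4],
}
frame = [0.05, 0.95, 0.5, 0.25, 1.05, 0.5]
print(classify(frame, templates))  # closest template is "hello"
```

A real system would classify sequences of frames rather than single poses, but the per-frame matching step looks broadly like this.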

The Kinect dataset is huge, and the machine learning demands serious computing resources for both storage and processing. Hence I used the Microsoft Azure cloud platform: Virtual Machines for the processing power, and Azure Storage to hold the data.

Thus I combined the Kinect hardware and the Azure cloud platform to empower the mute community. I hope this work will be carried forward and that better solutions will be found in this direction.

I thank the Microsoft team who supported me throughout the hackathon.
