What inspired us

We were inspired to work with people with disabilities. Sydney has a niece who is developmentally disabled, and she has always wanted the opportunity to integrate her love for the community with her love for programming. We came together as a team to make that dream a reality.

What we built

Machine learning has been a topic of great interest in the tech community for quite some time, and it is something our team members have been looking forward to working with. We built a web-based application that recognizes sign language letters from visual input, using models we trained with Clarifai, and outputs the text and audio corresponding to each letter.
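
As a rough illustration of the audio side, the sketch below uses the browser's built-in Web Speech API to speak a recognized letter aloud; the showLetter helper and the element ID are hypothetical stand-ins, not our actual page code.

```javascript
// Hypothetical helper: display a recognized letter and speak it aloud.
// Uses the standard Web Speech API available in modern browsers.
function showLetter(letter) {
  // Append the letter to an output element (the id is illustrative).
  document.getElementById('output').textContent += letter;

  // Speak the letter so the result is available as audio as well as text.
  const utterance = new SpeechSynthesisUtterance(letter);
  utterance.rate = 0.9; // slightly slower for clarity
  window.speechSynthesis.speak(utterance);
}
```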

How we built it

We used the Clarifai API to train our models to recognize a set of letters, using over 1,000 data points collected from various people throughout the hackathon, as shown in the video. We submit an image of our hands to the server, which processes the image and returns the letter that best matches our models. We then print the text based on the prediction's confidence percentage with JavaScript.
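
A minimal sketch of the prediction step, assuming Clarifai's v2 predict REST endpoint; the CLARIFAI_API_KEY and MODEL_ID values and the 0.85 confidence threshold are placeholders for illustration, not our actual configuration.

```javascript
// Send a base64-encoded image of a hand sign to a trained Clarifai model
// and return the letter concept with the highest confidence, if it clears
// a threshold. CLARIFAI_API_KEY and MODEL_ID are placeholders.
async function predictLetter(base64Image) {
  const response = await fetch(
    `https://api.clarifai.com/v2/models/${MODEL_ID}/outputs`,
    {
      method: 'POST',
      headers: {
        'Authorization': `Key ${CLARIFAI_API_KEY}`,
        'Content-Type': 'application/json'
      },
      body: JSON.stringify({
        inputs: [{ data: { image: { base64: base64Image } } }]
      })
    }
  );

  const result = await response.json();
  // Concepts typically come back ordered by confidence; keep the top one
  // if it is confident enough, otherwise report no match.
  const concepts = result.outputs[0].data.concepts;
  const best = concepts[0];
  return best.value > 0.85 ? best.name : null; // 0.85 is an example threshold
}
```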

Challenges we ran into

Effectively collecting data points for our various models and parsing the results to retrieve the correct letter.

What we learned

The Clarifai API, neural networks, artificial intelligence, and image recognition technology.
