Inspiration

I was inspired by my experiences working in my local community. There I met many members of the Deaf community, and I wanted to build an app that helps others learn sign language and connect with that community.

What it does

Sign uses the webcam to watch users form signs with their hands, gives feedback on each attempt, and keeps track of their progress and score.
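For a sense of how the webcam side could work, here is a minimal sketch of a detection loop built on handtrack.js. The element ids, model parameters, and feedback messages are placeholders for illustration, not the app's actual code.

```js
// Minimal sketch of the detection loop. The element ids ("webcam",
// "feedback") and the threshold values are placeholders.
import * as handTrack from "handtrackjs";

const video = document.getElementById("webcam");   // a <video autoplay> element
const feedback = document.getElementById("feedback");

const modelParams = {
  flipHorizontal: true, // mirror the feed so signing feels natural
  maxNumBoxes: 2,       // track at most two hands
  scoreThreshold: 0.6,  // ignore low-confidence detections
};

async function run() {
  const model = await handTrack.load(modelParams);
  await handTrack.startVideo(video);

  const loop = async () => {
    // Each prediction carries a bounding box, a class label, and a score.
    const predictions = await model.detect(video);
    if (predictions.length > 0) {
      const score = Number(predictions[0].score).toFixed(2);
      feedback.textContent = `Hand detected (confidence ${score})`;
    } else {
      feedback.textContent = "Show your hands to the camera";
    }
    requestAnimationFrame(loop);
  };
  loop();
}

run();
```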

How I built it

I built the frontend in React and the backend with Node and Express, using handtrack.js to detect hands in the webcam feed and Google Cloud ML Vision to classify the signs. I also made my own dataset of 100 images and trained the model on it.
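To illustrate the backend piece, here is a sketch of an Express route that forwards a captured frame to a custom Google Cloud model, assuming the custom dataset was trained with AutoML Vision. The project id, region, model id, and route name are placeholders, and the real app may be wired differently.

```js
// Sketch of an Express route that sends a captured frame to a custom
// AutoML Vision model. The project id, region, model id, and route
// name are placeholders.
const express = require("express");
const automl = require("@google-cloud/automl");

const app = express();
app.use(express.json({ limit: "5mb" })); // frames arrive as base64 JSON

const client = new automl.v1.PredictionServiceClient();
const modelName = client.modelPath("my-project", "us-central1", "my-sign-model");

app.post("/api/classify", async (req, res) => {
  try {
    // The frontend posts the webcam frame as a base64-encoded image.
    const imageBytes = Buffer.from(req.body.image, "base64");
    const [response] = await client.predict({
      name: modelName,
      payload: { image: { imageBytes } },
    });

    // Return the top predicted sign and its confidence, if any.
    const best = response.payload[0];
    if (!best) return res.json({ sign: null, confidence: 0 });
    res.json({ sign: best.displayName, confidence: best.classification.score });
  } catch (err) {
    res.status(500).json({ error: err.message });
  }
});

app.listen(3000, () => console.log("Sign backend listening on port 3000"));
```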

Challenges I ran into

I had trouble integrating the Google Cloud API and eventually ran out of time.

Accomplishments that I'm proud of

I am really proud of the app I made. I wanted to use this weekend to learn about backend engineering, and I feel like I accomplished that goal. I also learned a lot about Google Cloud and the process of machine learning.

What's next for Sign

I would like to keep working on the app and eventually deploy it so that others can use it to learn sign language!

Built With

React, Node, Express, handtrack.js, Google Cloud ML Vision
