Inspiration
People generally expect deaf people to write or type out messages to communicate with them, but why can't we accommodate deaf people instead? Our group created "Signlingo" in hopes of normalizing the use of sign language, which is used by more than 70 million deaf people worldwide.
What it does
Signlingo uses Deep Learning to translate pictures of American Sign Language (ASL) into English, which we hope can help others better understand sign language users.
How we built it
We used PyTorch for the Deep Learning aspect and Google Colab for collaboration.
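For anyone curious what the PyTorch side roughly looks like, here is a minimal sketch, not our exact architecture: a small convolutional network that classifies grayscale images of ASL letters into 26 classes. The class name, image size, and layer sizes are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Minimal sketch: a small CNN that classifies 64x64 grayscale images
# of ASL letters into 26 classes (A-Z). Sizes are illustrative.
class SignClassifier(nn.Module):
    def __init__(self, num_classes: int = 26):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),   # 1x64x64 -> 16x64x64
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 16x32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1),  # -> 32x32x32
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 32x16x16
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 128),
            nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

model = SignClassifier()
dummy = torch.randn(1, 1, 64, 64)  # one fake grayscale image
print(model(dummy).shape)          # torch.Size([1, 26])
```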
Challenges we ran into
We had difficulties building the ASL translation model and training it with PyTorch, but thanks to help from the mentors, we got our model running and learned a lot about using classes and objects in PyTorch. Due to a lack of time, we also could not refine the website further to include all the features of our prototype.
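The "classes and objects" part of training mostly comes down to feeding batches through a model in a loop like the sketch below. The dataset, model, and hyperparameters here are placeholders, not our real setup.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Placeholder data: 256 random grayscale "images" with random letter labels,
# standing in for a real ASL image dataset.
images = torch.randn(256, 1, 64, 64)
labels = torch.randint(0, 26, (256,))
loader = DataLoader(TensorDataset(images, labels), batch_size=32, shuffle=True)

# A tiny stand-in model; a real project would use a CNN class like the one above.
model = nn.Sequential(nn.Flatten(), nn.Linear(64 * 64, 26))
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(5):
    for batch_images, batch_labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(batch_images), batch_labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```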
Accomplishments that we're proud of
We’re proud of making a functional ASL translator with Deep Learning, designing a beautiful website prototype, and developing a basic website for Signlingo.
What we learned
We learned how to work with classes, objects, lists, and dictionaries in Python, how to build a basic website with HTML and CSS, and how to use Figma effectively to create attractive designs.
What's next for Signlingo
We hope to add more ASL phrases and eventually build a sustainable web server with leaderboard and achievement features to encourage users to keep translating, plus a dashboard that lets users review their past translations. We also hope to make translation faster and smoother!