Inspiration

Communication is a fundamental human right, yet many individuals face barriers due to hearing impairments or language differences. We were inspired to create a solution that bridges this gap by translating American Sign Language (ASL) hand signs into text. Additionally, we recognized that emotions play a crucial role in effective communication, so we integrated facial expression decoding to capture the emotional nuances often missed in text-based interactions.

What it does

Our project is a real-time application that interprets ASL hand signs captured through a camera and translates them into written text displayed on the screen. Simultaneously, it analyzes facial expressions to detect and display the user's emotions. This dual functionality ensures that both the literal message and the emotional context are conveyed, facilitating more comprehensive and empathetic communication.

How we built it
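As described throughout this post, our pipeline uses MediaPipe to extract hand landmarks from camera frames, Gemini to classify the sign, Hume to analyze facial expressions, and Flask to tie the services together. Below is a minimal sketch of the landmark-extraction and sign-classification step; the model name, prompt wording, and function names are illustrative assumptions rather than our exact code.

```python
# Sketch of the per-frame sign-classification path (illustrative, not our exact code).
import cv2
import mediapipe as mp
import google.generativeai as genai

genai.configure(api_key="YOUR_GEMINI_API_KEY")       # assumption: key supplied via config/env
gemini = genai.GenerativeModel("gemini-1.5-flash")    # assumption: any vision-capable Gemini model

hands = mp.solutions.hands.Hands(static_image_mode=True, max_num_hands=1)

def extract_landmarks(frame_bgr):
    """Return the 21 (x, y, z) hand landmarks MediaPipe finds in a frame, or None."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    result = hands.process(rgb)
    if not result.multi_hand_landmarks:
        return None
    return [(lm.x, lm.y, lm.z) for lm in result.multi_hand_landmarks[0].landmark]

def classify_sign(landmarks):
    """Ask Gemini to map landmark coordinates to a single ASL fingerspelling letter."""
    prompt = (
        "These are 21 MediaPipe hand landmarks (x, y, z) for one ASL "
        f"fingerspelling sign: {landmarks}. Reply with the single most likely letter."
    )
    return gemini.generate_content(prompt).text.strip()
```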

Challenges we ran into

We quickly ran into problems with ASL sign classification and took several approaches to solve them. We tested different models, including GPT-4o, Google Gemini, and models served through Groq, and found that Gemini was the best at classification. We also tried multiple ways of sending data for classification: raw images, landmark coordinates from MediaPipe, and combinations of both, as in the sketch below.
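One of the combined approaches looked roughly like the following: sending the raw frame and the MediaPipe landmark coordinates in a single Gemini request. The prompt wording and model name here are illustrative assumptions, not our exact code.

```python
# Sketch of the "raw image + landmarks" variant (illustrative, not our exact code).
import cv2
from PIL import Image
import google.generativeai as genai

gemini = genai.GenerativeModel("gemini-1.5-flash")    # assumption: any vision-capable Gemini model

def classify_with_image_and_landmarks(frame_bgr, landmarks):
    """Send both the camera frame and the MediaPipe landmarks to Gemini."""
    pil_img = Image.fromarray(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    prompt = (
        "Identify the ASL fingerspelling letter shown in this image. "
        f"MediaPipe hand landmarks (x, y, z): {landmarks}. "
        "Answer with one letter only."
    )
    return gemini.generate_content([prompt, pil_img]).text.strip()
```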

Accomplishments that we're proud of

We're proud to have developed a functional prototype that effectively translates ASL into text and accurately detects a range of emotions from facial expressions in real time. Overcoming the technical hurdles of integrating these complex systems was a significant achievement. Additionally, we're proud that our project has the potential to make communication more accessible and empathetic for underrepresented communities.

What we learned

This project taught us how to use the Hume, Groq, and Gemini APIs effectively, and how to delegate tasks to play to each teammate's strengths. We also learned how to connect all of these subroutines with Flask so that our services lived in one place, as sketched below.
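Here is a minimal sketch of that Flask layer, assuming two JSON endpoints; the route names and helper stubs are illustrative, not our exact code.

```python
# Sketch of the Flask app that keeps the services in one place (illustrative).
from flask import Flask, request, jsonify

app = Flask(__name__)

def classify_sign(landmarks):
    """Gemini-backed sign classifier (see the sketch under 'How we built it')."""
    ...

def detect_emotion(image_bytes):
    """Placeholder for the Hume-backed expression-analysis call."""
    ...

@app.route("/classify-sign", methods=["POST"])
def classify_sign_route():
    landmarks = request.json["landmarks"]                 # 21 (x, y, z) points from the client
    return jsonify({"letter": classify_sign(landmarks)})

@app.route("/detect-emotion", methods=["POST"])
def detect_emotion_route():
    frame_bytes = request.files["frame"].read()           # raw image bytes from the client
    return jsonify({"emotion": detect_emotion(frame_bytes)})

if __name__ == "__main__":
    app.run(debug=True)
```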

What's next for Human Underrepresented Emotions

Moving forward, we plan to expand our ASL database to include more signs and incorporate other sign languages to reach a broader audience. Currently, the app only supports individual fingerspelled letters, but we hope to add full gestures with richer meanings to the range of translatable signs. We'll work on improving the emotion detection model to recognize a more extensive range of emotions and cultural expressions. Additionally, we aim to develop a user-friendly interface and explore integration with assistive technologies, making "Human Underrepresented Emotions" a comprehensive tool for bridging communication gaps and fostering emotional understanding.

Built With

flask, gemini, groq, hume, mediapipe