Inspiration
Soccer commentary is full of energy and emotion, but it is not accessible to people who are Deaf or Hard-of-Hearing. I wanted to create something that keeps the excitement of commentary alive for everyone, ensuring no one misses out on the joy of the game.
What it does
The project translates soccer commentary into sign-language-style subtitles. It synchronizes the emotion and intensity of the commentary with the video, making the game accessible to Deaf and Hard-of-Hearing fans.
How we built it
I used FastAPI for the backend, Google Cloud Speech-to-Text for transcription, and Modus to wrap Llama 3.1 B for advanced NLP normalization. Dgraph serves as a knowledge graph storing the relationships between normalized soccer terms and sign-language animations. The front-end was built with React and Vite.
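The core lookup step of this pipeline can be sketched as follows. This is a minimal illustration, assuming the commentary has already been transcribed by Speech-to-Text and normalized by the LLM; the in-memory dictionary stands in for the Dgraph knowledge graph, and all term and asset names are hypothetical.

```python
# Stand-in for the Dgraph knowledge graph: normalized soccer term -> animation.
# All entries here are illustrative, not the project's actual data.
SIGN_GRAPH = {
    "goal": "anim/goal.gif",
    "penalty": "anim/penalty.gif",
    "offside": "anim/offside.gif",
}

def animations_for(normalized_terms):
    """Resolve each normalized term to its sign animation; None means no match."""
    return [(term, SIGN_GRAPH.get(term)) for term in normalized_terms]

print(animations_for(["goal", "corner"]))
```

Terms with no match resolve to None, which is where a fallback strategy (such as fingerspelling) takes over downstream.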
Challenges we ran into
Matching the intensity and emotion of the commentary to the animations was challenging. Very few resources support translating full text phrases directly into American Sign Language, so I pivoted to fingerspelling, using the individual ASL letter signs instead.
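The fingerspelling fallback amounts to breaking a word into per-letter sign assets. A hedged sketch of that idea, with hypothetical asset paths:

```python
def fingerspell(word):
    """Map each letter of a word to an ASL letter-sign asset (paths illustrative)."""
    return [f"letters/{ch.upper()}.png" for ch in word if ch.isalpha()]

print(fingerspell("goal"))
```

In practice this keeps every word representable even when no phrase-level animation exists, at the cost of slower, letter-by-letter signing.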
Accomplishments that we're proud of
I'm proud of creating a system that successfully combines AI, knowledge graphs, and sign-language animations to make soccer commentary accessible. Building a C++ library to overlay the sign-language subtitles at the proper timestamps was a big challenge and very rewarding.
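The overlay library itself is C++, but the core timing logic it implements can be sketched in Python: given subtitle entries with start and end times, select which sign should be on screen at a given video time. The timeline entries here are invented for illustration.

```python
def active_signs(timeline, t):
    """Return the signs whose [start, end) window contains video time t (seconds)."""
    return [sign for start, end, sign in timeline if start <= t < end]

# Hypothetical timeline: (start_sec, end_sec, sign_label)
timeline = [(0.0, 1.2, "GOAL"), (1.2, 2.0, "PENALTY")]
print(active_signs(timeline, 0.5))
```

Running this lookup once per rendered frame is what keeps the sign subtitles synchronized with the commentary's timestamps.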
What we learned
I learned how to leverage cloud services and AI models to perform inference on transcriptions, and how to manipulate video.
What's next for Soccer-Commentary-DHH
For this hackathon, the scope was to prove the concept and narrow the focus to just soccer. In the future, I want to add more sign-language animations and expand to commentary for other sports.