Inspiration
After watching and being impressed by the ASL interpreters at the opening ceremony, our team realized that we, as hearing individuals, couldn't understand what they were signing. It made us think about how easily hearing people can overlook communication barriers, so we did some research: ASL is the primary mode of communication for 500,000 people in the United States, but only about 2% of Americans know it. One cause of this gap is that learning ASL without a teacher can be challenging, and unlike resources for spoken languages, few online platforms offer real-time feedback.
What it does
Excel ASL bridges this gap by expanding access to ASL education and enabling learners to improve their signing accuracy from the comfort of their own homes. After logging into their account (to save progress), users can choose to Learn, Practice, or Test. Together, these three features help users familiarize themselves with ASL, build muscle memory, and then test proficiency and track their progress.
How We Built It
We built a React-based web app with a Python backend using OpenCV for webcam input and MediaPipe for hand tracking. FastAPI powers gesture detection, while Supabase handles secure login and progress tracking. We collaborated via Git and GitHub for version control.
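To illustrate the gesture-detection step, here is a minimal sketch of how a static letter could be classified from the 21 hand landmarks MediaPipe produces per frame. The finger-extension rules and the two letters shown are simplified examples we wrote for illustration, not the project's actual detection logic:

```python
# MediaPipe Hands returns 21 (x, y) landmarks per hand: index 0 is the
# wrist, and each finger ends at a tip (indices 4, 8, 12, 16, 20).
# Letter rules below are toy examples, not a trained model.

FINGER_TIPS = [8, 12, 16, 20]   # index, middle, ring, pinky tips
FINGER_PIPS = [6, 10, 14, 18]   # corresponding middle (PIP) joints

def fingers_extended(landmarks):
    """Return a boolean per non-thumb finger: is it extended?

    In image coordinates y grows downward, so an upright, extended
    finger has its tip *above* (smaller y than) its middle joint.
    """
    return [landmarks[tip][1] < landmarks[pip][1]
            for tip, pip in zip(FINGER_TIPS, FINGER_PIPS)]

def classify_letter(landmarks):
    """Toy classifier for two easy letters; returns None when unsure."""
    extended = fingers_extended(landmarks)
    if all(extended):
        return "B"   # open palm, four fingers pointing up
    if not any(extended):
        return "A"   # closed fist
    return None
```

In a pipeline like ours, landmarks of this shape would come from running MediaPipe's hands solution on each OpenCV webcam frame, with the predicted letter returned to the React frontend through a FastAPI endpoint.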
Challenges We Ran Into
We initially tried building with React and Next.js, but integrating OpenCV in JavaScript caused setup issues. Switching to a Python backend solved most of them, although installations were extremely slow and we had to resolve conflicts between our different operating systems. As with every collaborative project, we also ran into merge conflicts, but worked through them as a team. Finally, we faced hurdles saving user progress data; that part is still not fully working and will be our first priority after the event.
Accomplishments that we're proud of
We are so proud of ourselves for getting the integrated webcam and OpenCV to run seamlessly! In addition, our UI designer outdid herself, and we love the look and feel of the website. This was a significant step for us, as two of our three team members had never worked with OpenCV or React, and none of us had experience with hand-tracking software. We cannot wait to see where this project goes and to apply the skills we developed over the past weekend to even more incredible opportunities!
What we learned
We learned how to integrate computer vision into a real web application by combining OpenCV, MediaPipe, FastAPI, React, and Supabase, and we gained a much deeper understanding of how full-stack systems work together. Through building an accessibility-focused tool, we also learned how important thoughtful design and clear user interaction are when creating something meant to support beginners in ASL. Most of all, we learned how powerful technology can be when paired with empathy, teamwork, and a genuine commitment to inclusivity.
What's next for Excel ASL
Looking ahead, we plan to expand beyond the alphabet category by adding numbers, family, and work/professional categories. To build on the alphabet, we want to add vocabulary lessons, animated sign demonstrations, streaks, more practice modes, and eventually a custom-trained ASL model for improved accuracy. Our long-term goal is to make ASL learning widely accessible so more people can connect meaningfully with Deaf individuals in their everyday lives.