Inspiration
I sometimes cannot understand what people are saying in Zoom, and I wish calls came with automatic transcription. I have no diagnosed auditory issues, so I can only assume the experience is worse for people with hearing disabilities.
What it does
We revamped Zoom to be friendlier for people who are hard of hearing or have auditory processing issues. Automatic transcription promotes active inclusivity, and the transcript can be annotated, edited, and augmented with text so that people with hearing disabilities get the best possible experience. Other prominent features include setting per-person volume levels, notifying participants when their name is spoken, and messaging a speaker directly when you are confused.
How we built it
We used Figma, paper, and pens.
Challenges we ran into
Designing a comprehensive UI/UX that people with hearing disabilities can navigate easily was our biggest challenge.
Accomplishments that we're proud of
This was my first time using Figma for an official project, and I am really proud of how it came out.
What we learned
We learned how to make professional graphics in Figma.
What's next for Whoosh
One feature we would love to add in the future is an AI model that can watch ASL (American Sign Language) and transcribe it just like any verbal contribution to a conversation.
Built With
- figma