Inspiration - Better quality of life for the hearing impaired through improved communication and social understanding.

What it does - Caption Everything helps the hearing impaired follow conversations using AR that displays real-time captions on the lens of the Magic Leap One. Our solution is unique because we plan to include sentiment and emotion analysis, language translation, and facial recognition as part of the language analysis. Caption Everything also works well for people who have difficulty reading social cues and body language.
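To give a feel for the planned sentiment feature, here is a minimal sketch of how a caption might be decorated with an emotion label before it is shown in AR. The EmotionResult type and its fields are assumptions for illustration, not part of any real SDK.

```csharp
// Hypothetical result of an emotion/tone analysis pass over a transcript.
public struct EmotionResult
{
    public string DominantEmotion; // e.g. "joy", "anger" (assumed labels)
    public float Confidence;       // 0..1
}

public static class CaptionFormatter
{
    // Combines the raw transcript with an emotion label so the wearer
    // sees both what was said and how it was likely said.
    public static string Format(string transcript, EmotionResult emotion)
    {
        if (emotion.Confidence < 0.5f)
            return transcript; // not confident enough to annotate
        return $"{transcript}  [{emotion.DominantEmotion}]";
    }
}
```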

How we built it - We used the Unity SDK for straightforward integration with Watson and fast compiling for the Magic Leap One. IBM Watson handles the speech-to-text processing, and the Magic Leap One displays the resulting captions in AR.
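A minimal sketch of the display side: a Unity component that keeps a caption floating in front of the wearer and is fed transcript text by whatever speech-to-text service is wired up (IBM Watson in our case). The OnTranscript hookup is an assumption about how the Watson callback would be connected, not the Watson Unity SDK's actual API.

```csharp
using UnityEngine;

public class CaptionDisplay : MonoBehaviour
{
    public TextMesh captionText;       // assigned in the Unity editor
    public Camera headCamera;          // the Magic Leap headpose camera
    public float distanceMeters = 1.5f;

    // Called by the speech-to-text layer (assumed wiring) whenever a new
    // interim or final transcript arrives.
    public void OnTranscript(string transcript)
    {
        captionText.text = transcript;
    }

    void LateUpdate()
    {
        // Keep the caption a fixed distance in front of the user's gaze.
        Transform cam = headCamera.transform;
        transform.position = cam.position + cam.forward * distanceMeters;
        transform.rotation = Quaternion.LookRotation(transform.position - cam.position);
    }
}
```

Keeping the caption anchored to the headpose camera means the text stays readable regardless of where the wearer looks.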

Challenges we ran into - Surprisingly, we were unable to find a Unity or even a C# developer for the competition, which forced us to learn Unity, learn a new programming language, and build on cutting-edge hardware and software we had never used, all on the spot.

Accomplishments that we're proud of - We did it! It was a long, arduous learning curve, but we surmounted it and created something that will be good for humanity.

What we learned - Unity is not easy. SDKs work very well in isolation but become more complex as you stack them. The Magic Leap One is actually one of the most compelling XR devices around.

What's next for Caption Everything - Hopefully a win, then refining the software for mass release so it can improve the lives of one of the largest communities in the world.
