Inspiration

This project was inspired by stories about our grandparents. We have all seen the elderly people in our lives become isolated in group settings because they could not follow what was being said. We wanted to create a product that helps them understand the conversations around them and thus fosters greater interaction with their loved ones.

What it does

Our product uses augmented reality to supplement conversations with closed captioning. We want people with hearing loss to be able to follow group conversations in real time, so we will use voice fingerprinting to distinguish between speakers.
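A minimal sketch of the speaker-attribution idea: each enrolled person has a voice "fingerprint" (here a toy embedding vector), and an incoming utterance is attributed to the closest fingerprint by cosine similarity. The names, vectors, and `attribute_speaker` helper are all hypothetical illustrations, not our actual implementation; a real system would produce the embeddings with a speaker-embedding model.

```python
import math

def cosine_similarity(a, b):
    # Standard cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def attribute_speaker(utterance_embedding, enrolled):
    # Return the enrolled speaker whose fingerprint best matches the utterance,
    # so the caption bubble can be attached to the right person.
    return max(enrolled, key=lambda name: cosine_similarity(utterance_embedding, enrolled[name]))

# Toy fingerprints for two hypothetical speakers.
enrolled = {
    "Grandma": [0.9, 0.1, 0.2],
    "Alex":    [0.1, 0.8, 0.3],
}

print(attribute_speaker([0.85, 0.15, 0.25], enrolled))  # → Grandma
```

Once an utterance is attributed, its transcribed text can be rendered as a caption tagged with that speaker's name.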

How we built it

Design: We first gathered data about deafness from scientific journals to inform our design. We then moved on to competitive analysis, persona creation, user journeys, and sketching.

Development: We built the prototype with Magic Leap, Unity, and C#.

Challenges we ran into

We were sharing the Magic Leap headset with another team, which was a blocker. Ideally we would have used image recognition and voice fingerprinting, but time constraints made that impossible.

Accomplishments that we're proud of

We are proud of having a demo which highlights the social progress that can be made using AR technology!

What we learned

We learned a lot about the correlation between hearing loss and depression. We believe that AR can be leveraged to help individuals facing hearing loss better engage with the people in their world.

What's next for

We would like to focus on image recognition and voice fingerprinting, so the user could be in a group setting and get closed captioning of the conversations happening around them. Speech bubbles would correspond spatially to the person speaking. We would also like the user to be able to focus on one person's speech through gesture (moving their head towards a speaker could emphasize that person's speech bubble).
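The head-gesture idea above could be sketched as follows: pick the speaker whose direction best aligns with the user's gaze, using a dot product between normalized vectors. All names and positions here are made-up illustrations; in the Unity build this would read the headset's forward vector each frame rather than hard-coded values.

```python
import math

def normalize(v):
    # Scale a vector to unit length so dot products compare direction only.
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def focused_speaker(gaze, speakers):
    # speakers maps a name to the direction vector from the user to that speaker.
    # The speaker whose direction has the largest dot product with the gaze
    # vector is the one the user is turned towards; their bubble gets emphasized.
    gaze = normalize(gaze)
    def alignment(name):
        d = normalize(speakers[name])
        return sum(g * x for g, x in zip(gaze, d))
    return max(speakers, key=alignment)

# Hypothetical scene: one speaker ahead-right, one straight ahead.
speakers = {"Grandpa": [1.0, 0.0, 0.0], "Maya": [0.0, 0.0, 1.0]}
print(focused_speaker([0.9, 0.0, 0.1], speakers))  # → Grandpa
```

Emphasis could then be a simple visual change (larger font, higher opacity) on the focused speaker's bubble, with the others dimmed.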

Built With

Magic Leap, Unity, C#