Inspiration

One of the mentors mentioned the idea of helping deaf people experience concerts in a better way. We quickly searched for existing solutions and talked with other mentors, and realized that very few exist. However, the hardware components we needed weren't available, so we had to pivot. That's where this project came in: we wanted to enhance the video and movie watching experience for deaf viewers, with a portable solution that helps them feel immersed in what they watch.

What it does

We decided that the best way to immerse deaf viewers in what they're watching is through touch: take the visual cues on screen and emulate what they would feel like. We ended up doing just that; using vibration sensors, the device matches what's happening on screen with physical feedback.
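
Concretely, each visual event (an explosion, footsteps, a door slam) can be mapped to a timed haptic cue. Below is a minimal sketch of that mapping; the field names, motor positions, and numbers are illustrative placeholders rather than our exact format:

```cpp
#include <cstdint>
#include <vector>

// One haptic cue: when it fires, which motor, how strongly, and for how long.
struct HapticCue {
    uint32_t timestampMs;  // offset from the start of the video
    uint8_t  motor;        // 0-3, one per contact point on the wearable
    uint8_t  intensity;    // 0-255 vibration strength
    uint16_t durationMs;   // pulse length
};

// Example cue track: an explosion that hits every motor, then light footsteps on one side.
std::vector<HapticCue> cueTrack = {
    {12000, 0, 255, 500},
    {12000, 1, 255, 500},
    {12000, 2, 255, 500},
    {12000, 3, 255, 500},
    {15200, 2, 120, 150},
    {15600, 2, 120, 150},
};

int main() { return 0; }  // the data shape is the point; nothing to run yet
```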

How I built it

We placed 4 vibration sensors around a hat (it's comfortable to wear, and we happened to have one with us) and connected them to an Arduino board. Our next step was injecting metadata into the video that tells the hat how to behave.
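
As a rough illustration of the firmware side, the sketch below shows how an Arduino could listen for cues over serial and pulse the motors. The pin numbers, baud rate, and "motor,intensity,duration" line format are assumptions for the sketch, not our exact protocol, and in practice the motors would be driven through a transistor stage rather than straight from the pins:

```cpp
// Arduino sketch: read cue lines like "2,200,400" (motor, PWM intensity, duration in ms)
// from serial and pulse the matching vibration motor.

const int MOTOR_PINS[4] = {3, 5, 6, 9};   // PWM-capable pins (assumed wiring)
unsigned long offAt[4] = {0, 0, 0, 0};    // when each motor should switch off

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < 4; i++) {
    pinMode(MOTOR_PINS[i], OUTPUT);
    analogWrite(MOTOR_PINS[i], 0);
  }
}

void loop() {
  // Start a pulse when a new cue arrives.
  if (Serial.available()) {
    int motor = Serial.parseInt();
    int intensity = Serial.parseInt();   // 0-255 duty cycle
    long duration = Serial.parseInt();   // milliseconds
    if (motor >= 0 && motor < 4) {
      analogWrite(MOTOR_PINS[motor], constrain(intensity, 0, 255));
      offAt[motor] = millis() + duration;
    }
    while (Serial.available() && Serial.read() != '\n') {}  // drop the rest of the line
  }

  // Switch off motors whose pulse has elapsed.
  for (int i = 0; i < 4; i++) {
    if (offAt[i] != 0 && millis() >= offAt[i]) {
      analogWrite(MOTOR_PINS[i], 0);
      offAt[i] = 0;
    }
  }
}
```

On the playback side, a small script could step through the cue metadata and write these lines to the serial port in sync with the video's current timestamp.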

Challenges I ran into

Building the hardware was our biggest challenge.

Accomplishments that I'm proud of

The biggest accomplishment was going from a great idea to a tangible prototype.

Built With

Arduino, vibration sensors