Inspiration

We were inspired by the idea of providing the visually impaired with a way to experience the images we encounter on a day-to-day basis, using audio.

What it does

EarMersion is a webapp that converts images into ambient soundscapes. This allows the visually impaired to experience images through audio.

How I built it

We used Microsoft's Vision API to label the objects in each image, then used Node.js, Python, and Bootstrap to serve a compilation of sound effects, found on the web, that fits the image.
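The tagging step could be sketched as below. This is a minimal illustration, not our exact code: the endpoint and key are placeholders, and the JSON parsing follows the documented response shape of the Azure Computer Vision "tag" operation.

```python
import json

def tag_request(endpoint: str, key: str) -> dict:
    """Build the URL and headers for a Vision API tag request
    (image bytes would be POSTed as the body)."""
    return {
        "url": f"{endpoint}/vision/v3.2/tag",
        "headers": {
            "Ocp-Apim-Subscription-Key": key,
            "Content-Type": "application/octet-stream",
        },
    }

def extract_tags(response_json: str, min_confidence: float = 0.5) -> list:
    """Keep tag names whose confidence clears the threshold, so that
    low-confidence labels don't pull in unrelated sound effects."""
    data = json.loads(response_json)
    return [t["name"] for t in data.get("tags", [])
            if t.get("confidence", 0.0) >= min_confidence]

# Example response in the shape the tag operation returns:
sample = ('{"tags": [{"name": "beach", "confidence": 0.98}, '
          '{"name": "water", "confidence": 0.95}, '
          '{"name": "person", "confidence": 0.40}]}')
print(extract_tags(sample))  # → ['beach', 'water']
```

The surviving tag names are then used as search terms when collecting sound effects for the soundscape.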

Challenges I ran into

We initially had issues with the website we used to source sound effects: the effects it returned were sometimes irrelevant to the tags extracted from the user's photo, so they failed to recreate the atmosphere of the uploaded image.

Accomplishments that I'm proud of

A working product that produces great results, and the experience of programming as a team.

What I learned

How to use Microsoft's Cognitive Services Vision API, web scraping, and CSS with Bootstrap.

What's next for EarMersion

More ease-of-use features to further help the visually impaired, for example a text-to-speech function, and making the app more accessible to people with other disabilities.
