I met Yousef, a blind student at the University of Michigan, when he asked me for help getting to his class. There was an event on campus with tents and vehicles obstructing the usual pathways between classes, and Yousef couldn't navigate the changed environment without bumping into things. His cane might catch an obstacle, but his head would catch what the cane missed.

He said new environments were almost impossible for him to navigate without getting lost or hurting himself, even with his white-and-red cane. But in an age of autonomous cars and 3D sound, shouldn't blind people be able to detect the objects around them?

What it does

We use Microsoft's HoloLens augmented reality glasses to build a real-time 3D map of the immediate environment and turn it into a sound map of the objects in it. Based on where objects such as walls and people are in relation to you, we ping each object with a sound placed in your 3D audio environment according to its proximity and direction.

We use three different sounds:

  1. A radar ping that sweeps back and forth across the 180 degrees in front of you. The ping's pitch and intensity vary with each object's proximity and location.
  2. A parking sensor sound to warn of collisions. Like a car's parking sensor, it alerts you when you are about to hit an object. Whether the hazard is at your head or your feet, the collision alert is positioned on the object you are about to walk into.
  3. A mapping sound. It sounds like an older telephone and alerts the user that the HoloLens is currently mapping the new environment, so no object information will be available for a few moments.

These three sounds let users track multiple objects around them and navigate from point A to point B without sight.
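To give a feel for how a single ping could be parameterized, here is a minimal Python sketch. The function name, pitch and volume curves, and maximum range are illustrative assumptions for this writeup, not the tuning we use on the device.

```python
import math

def ping_parameters(distance_m, angle_deg, max_range_m=5.0):
    """Map an obstacle's distance and direction to radar-ping parameters.

    angle_deg is relative to the user's heading: -90 = far left,
    0 = straight ahead, +90 = far right. Returns (pitch_hz, volume, pan).
    """
    closeness = max(0.0, 1.0 - distance_m / max_range_m)  # 1 = touching, 0 = out of range
    pitch = 220.0 + 660.0 * closeness        # closer objects ping higher...
    volume = 0.2 + 0.8 * closeness           # ...and louder
    pan = math.sin(math.radians(angle_deg))  # -1 = hard left, +1 = hard right
    return pitch, volume, pan

# An obstacle 1 m away and 30 degrees to the right:
pitch, volume, pan = ping_parameters(1.0, 30.0)
```

In Unity you would feed values like these into an AudioSource rather than synthesize them by hand, but the mapping from geometry to sound is the core idea.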

How we built it

We use the Microsoft HoloLens to build a real-time map of the space the user is in. The HoloLens' IR sensors give us the depth of everything in front of the wearer; we then bring this map into Unity so we can determine where the obstacles to avoid are located.
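To make that reduction concrete, here is a hedged Python sketch of the kind of processing involved: collapsing one row of depth readings into the nearest obstacle per angular sector. The function name, sector count, and field of view are assumptions for illustration, not values from our Unity code, which works on the spatial mesh instead.

```python
def obstacle_sectors(depth_row, n_sectors=5, fov_deg=120.0):
    """Collapse one horizontal row of depth readings (meters, left to
    right across the camera's field of view) into the nearest obstacle
    per angular sector. A reading of 0 means "no depth data"."""
    per = len(depth_row) // n_sectors
    sectors = []
    for s in range(n_sectors):
        chunk = [d for d in depth_row[s * per:(s + 1) * per] if d > 0]
        center = -fov_deg / 2 + fov_deg * (s + 0.5) / n_sectors  # sector center angle
        sectors.append((center, min(chunk) if chunk else None))
    return sectors

# Ten readings across a 120-degree field of view, nearest obstacle per sector:
sectors = obstacle_sectors([3, 3, 1, 1, 2, 2, 4, 4, 5, 5])
```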

In Unity, a virtual ball sweeps left to right along the mesh of the space, producing the radar sound based on the mesh's proximity to the user.
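The sweep itself can be sketched as a simple triangle wave over time; in practice Unity's update loop would drive the ball's angle each frame and raycast against the mesh at that angle. The two-second period here is an assumed value, not our actual setting.

```python
def sweep_angle(t_seconds, period_s=2.0):
    """Angle of the virtual radar ball at time t: it sweeps from -90
    degrees (far left) to +90 (far right) and back once per period."""
    phase = (t_seconds % period_s) / period_s  # 0..1 through the period
    tri = 1.0 - abs(2.0 * phase - 1.0)        # triangle wave: 0 -> 1 -> 0
    return -90.0 + 180.0 * tri

# Sampled every half second over one period: -90 -> 0 -> 90 -> 0 -> -90
angles = [round(sweep_angle(0.5 * t)) for t in range(5)]
```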

Challenges we ran into

Learning how to develop on the HoloLens, and porting and updating the spatial map from the device into Unity. Sound design was a large part of this project: finding sounds that carry enough information without being too annoying, learning to create cues that are informative yet somewhat ambient, and learning how to manipulate sound in Unity.

Practicing. We found that, no matter which detection method we used, we needed about half an hour of practice with it before we felt confident walking around and avoiding obstacles.

Accomplishments that we're proud of

We made something that can actually help real people, like Yousef! That feels awesome! We also overcame a bunch of obstacles on a new platform, did more with Unity than we thought possible, and learned sound design.

What we learned

We learned enough sound design to make something you can bear hearing for extended periods of time. We also got much better at Unity, C#, Visual Studio, and developing for the HoloLens.

About human hearing. Humans can't distinguish the direction of sounds when more than three play simultaneously. This led us to the radar approach of sweeping left to right, so the user hears only one sound from one direction at a time, and two sounds at most when they are about to hit something.

What's next for Sound Sense

We would love a chance to give it to Yousef to try out. We're sure he would be thrilled! We would also like to add a couple more features to make it even more useful for the visually impaired community. The first is taking a picture and sending it to Google's DeepMind AI to describe the scene in front of the user.

The second feature is Bing Maps integration, so the user could hear turn-by-turn navigation in steps: "Walk 10 steps, then turn 90 degrees to the right and walk 1,300 steps." "Walk sign is on, please cross the street." The HoloLens hardware is not quite ready for that type of integration since it lacks a GPS chip, but we think that by combining the camera with a Bluetooth connection to a phone supplying the GPS data, we can create a viable mapping system that gets blind and visually impaired users from point A to point B.
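As a sketch of what step-based instructions might look like, here is a hedged Python example. The stride length, function name, and phrasing are our own assumptions; a real system would calibrate stride per user and pull route legs from the mapping API.

```python
def leg_to_instruction(distance_m, turn_deg, stride_m=0.75):
    """Convert one route leg into a step-based spoken instruction.

    turn_deg is the turn at the start of the leg (positive = right,
    negative = left, 0 = continue straight). stride_m is an assumed
    average stride length.
    """
    steps = round(distance_m / stride_m)
    if turn_deg == 0:
        return f"Walk {steps} steps."
    side = "right" if turn_deg > 0 else "left"
    return f"Turn {abs(turn_deg)} degrees to the {side}, then walk {steps} steps."

# leg_to_instruction(7.5, 90) -> "Turn 90 degrees to the right, then walk 10 steps."
instruction = leg_to_instruction(7.5, 90)
```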
