Inspiration

We want to make a difference for people who don't receive the support and quality of life they deserve.

What it does

Sonic Guidance uses ultrasonic sensors to measure the distance between the user and the environment around them. We use the sensor readings to determine whether an object is a safe distance away or whether there is a large drop-off ahead, and we relay this information to the user through both auditory and haptic feedback.
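
As a rough sketch of how that logic can look in firmware (the pin numbers, thresholds, HC-SR04-style sensor, and feedback hardware below are assumptions for illustration, not the exact code running on our device):

```
// Sketch of the obstacle / drop-off detection loop (Particle-style C++).
// Pins, thresholds, and the HC-SR04-style sensor are illustrative assumptions.
const int TRIG_PIN   = D2;   // ultrasonic trigger
const int ECHO_PIN   = D3;   // ultrasonic echo
const int BUZZER_PIN = D4;   // auditory feedback
const int MOTOR_PIN  = D5;   // haptic feedback (vibration motor)

const float OBSTACLE_CM = 60.0;   // closer than this -> obstacle warning
const float DROPOFF_CM  = 200.0;  // much farther than the usual ground reading -> possible drop-off

// Measure distance in centimetres from one trigger/echo cycle.
float readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  // Echo pulse width (us) -> distance; speed of sound ~0.0343 cm/us, halved for the round trip.
  unsigned long duration = pulseIn(ECHO_PIN, HIGH);
  return (duration * 0.0343f) / 2.0f;
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  pinMode(BUZZER_PIN, OUTPUT);
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  float distance = readDistanceCm();
  if (distance > 0 && distance < OBSTACLE_CM) {
    // Object too close: beep and vibrate.
    digitalWrite(BUZZER_PIN, HIGH);
    digitalWrite(MOTOR_PIN, HIGH);
  } else if (distance > DROPOFF_CM) {
    // Reading far longer than the usual ground return: possible drop-off, vibrate only.
    digitalWrite(MOTOR_PIN, HIGH);
    digitalWrite(BUZZER_PIN, LOW);
  } else {
    digitalWrite(BUZZER_PIN, LOW);
    digitalWrite(MOTOR_PIN, LOW);
  }
  delay(100);
}
```

A downward-angled sensor normally sees the ground at a fairly constant distance, so a reading that is suddenly much longer is what gets treated as a possible drop-off.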

How we built it

We started by listing the shortcomings of the current solutions on the market and looked for ways to improve on them. We then tested each individual sensor to evaluate its functionality and to understand the capabilities of each component.

Challenges we ran into

The biggest challenge was getting accurate distance readings from the ultrasonic sensor. Because the sensor is extremely sensitive to movement, the raw data was all over the place. The next challenge was learning to use the Particle Photon, which we wanted to use for data transfer so we could better analyse the data. Initially the Photon worked fine, but it suddenly stopped responding.
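
One way to tame the noisy readings is to act on the median of a few consecutive samples rather than a single reading, and the Photon can then publish each filtered value to the Particle cloud for later analysis. This is only a sketch under assumptions: readDistanceCm() is the hypothetical measurement helper from the earlier snippet, and the event name is made up.

```
// Illustrative smoothing + data-transfer snippet (Particle C++).
const int SAMPLES = 5;

// Take a few readings and return the median, which ignores occasional spikes.
float readSmoothedCm() {
  float buf[SAMPLES];
  for (int i = 0; i < SAMPLES; i++) {
    buf[i] = readDistanceCm();
    delay(20);
  }
  // Simple insertion sort; with 5 samples, speed is not a concern.
  for (int i = 1; i < SAMPLES; i++) {
    float v = buf[i];
    int j = i - 1;
    while (j >= 0 && buf[j] > v) {
      buf[j + 1] = buf[j];
      j--;
    }
    buf[j + 1] = v;
  }
  return buf[SAMPLES / 2];
}

void publishDistance(float cm) {
  // Particle.publish sends the reading to the Particle cloud, where it can be
  // logged and analysed later. PRIVATE keeps the event off the public stream.
  Particle.publish("sonic/distance", String::format("%.1f", cm), PRIVATE);
}
```

Particle.publish is rate-limited (roughly one event per second), so in practice only a subset of readings would be pushed to the cloud rather than every loop iteration.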

Accomplishments that we're proud of

The accomplishment we are most proud of as a team is creating a product that can improve the lives of people who are visually impaired. We started with nothing but a rough sketch of the design and ended with a working prototype.

What we learned

We learned how to work together as a team and how to pass data between hardware and software. We also learned how to use ultrasonic sensors to build a system that relays distance information from the environment to the user.

What's next for Sonic Guidance

We want to add data analysis so we can gather better feedback and improve the solution as a whole.
