Inspiration
We wanted to help a group often overlooked by society: the blind.
What it does
Helps visually impaired users regain their independence by providing tactile feedback to avoid obstacles and text-to-speech that describes the surrounding world.
How we built it
- AlwaysAI to implement object recognition
- Arduino Uno to control proximity sensors and tactile feedback
- Raspberry Pi to communicate recognized objects to user via speech synthesis
- Laser cutting to custom design a frame
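The core loop that ties these pieces together can be sketched as follows. This is a minimal illustration, not our exact firmware: the 100 cm usable range, the PWM scale, and the `espeak` command are assumptions, and the alwaysAI detection call that would feed `detected_labels` is omitted.

```python
import subprocess


def vibration_strength(distance_cm, max_range_cm=100):
    # Closer obstacles produce stronger haptic feedback.
    # Maps distance to a 0-255 PWM duty cycle for the vibration motors;
    # the 100 cm usable range is an assumption about the ultrasonic sensors.
    clamped = max(0, min(distance_cm, max_range_cm))
    return round(255 * (1 - clamped / max_range_cm))


def new_announcements(previously_seen, detected_labels):
    # Only speak object labels that were not announced on the previous
    # frame, so the speech synthesizer does not repeat itself endlessly.
    return sorted(set(detected_labels) - set(previously_seen))


def speak(labels):
    # Hypothetical TTS hook: shells out to the espeak CLI on the Pi.
    if labels:
        subprocess.run(["espeak", ", ".join(labels) + " ahead"])
```

On each frame, the Pi would announce `new_announcements(...)` while the Arduino drives the motors at `vibration_strength(...)` for the nearest obstacle.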
Challenges we ran into
- Working around the poor range and precision of ultrasonic proximity detectors
- Powering the multitude of servos and sensors
- Setbacks when trying to integrate the text-to-speech library with AlwaysAI
- Not enough computational power on the Raspberry Pi to utilize AlwaysAI
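One way to work around the sensors' poor precision is to median-filter the raw readings, which suppresses the occasional spurious spike without lagging as much as an average. A minimal sketch; the window size of 5 is an assumption:

```python
from statistics import median


def filtered_distance(readings, window=5):
    # Median of the last `window` raw ultrasonic readings.
    # A single wild reading (echo off an angled surface, crosstalk
    # between sensors) cannot shift the median, unlike a mean.
    recent = readings[-window:]
    return median(recent)
```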
Accomplishments that we're proud of
- Designing a portable power supply to support multiple servos, sensors, and an Arduino
- Custom designing a flexible, adaptable frame to support the electronics
What we learned
- How to work under high-pressure deadlines
- How to integrate sensors and peripherals with the Raspberry Pi and Arduino
What's next for Sonic Navigation and Object Recognition
- Making the design more compact, discreet, and lightweight