Inspiration
We wanted to help blind people navigate with a better system than a simple cane.
What it does
It gives the user an accurate representation of the environment in their front view.
How we built it
We divided our design challenge into a set of problems that we knew we needed to solve:
- Collecting information about the world through an intelligent computer vision system running on an edge device
- Distributing key information about visual cues of the world to the user
- Handling objects at different heights
- Handling objects moving at velocity
We developed a toolkit of candidate solutions to the problems we picked, but retained all our other ideas. These backup options helped when we hit development problems with our initial approach.
The design collects input from a visual feed through a low-cost webcam and runs machine learning processing on a Raspberry Pi Zero, which acts as our edge device. The installed Python script uses a multi-threading module to collect and process visual data and to deliver digital output to servo motors. Our team uses the Raspberry Pi's digital output channels and a 5-volt supply to operate an array of 6 servos, each conveying information about the left-right direction and near-far status of objects. Our implementation plan is for the servos to be mounted on a wearable platform.
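As a rough sketch of how detections could be routed to the 6-servo array, the function below maps an object's bounding box to one servo. The 3-column-by-2-band layout, zone boundaries, and area threshold are illustrative assumptions, not the team's actual tuned values:

```python
# Hedged sketch: map a detection's bounding box to one of 6 haptic servos,
# arranged as 3 horizontal zones (left, center, right) x 2 distance bands
# (near, far). Frame size and the "near" area threshold are assumptions.

FRAME_WIDTH = 640          # assumed camera frame width in pixels
FRAME_HEIGHT = 480         # assumed camera frame height in pixels
NEAR_AREA_FRACTION = 0.10  # assumed: a box covering >10% of the frame is "near"

def servo_for_detection(box, frame_width=FRAME_WIDTH, frame_height=FRAME_HEIGHT):
    """Return (servo_index, label) for a bounding box (x_min, y_min, x_max, y_max).

    Servos 0-2 signal far objects (left, center, right);
    servos 3-5 signal near objects in the same order.
    """
    x_min, y_min, x_max, y_max = box

    # Horizontal zone from the box's center x: 0=left, 1=center, 2=right.
    center_x = (x_min + x_max) / 2
    column = min(int(3 * center_x / frame_width), 2)

    # Distance band from apparent size: larger box => closer object.
    area = (x_max - x_min) * (y_max - y_min)
    near = area > NEAR_AREA_FRACTION * frame_width * frame_height

    index = column + (3 if near else 0)
    labels = ["far-left", "far-center", "far-right",
              "near-left", "near-center", "near-right"]
    return index, labels[index]
```

In a real loop, a capture thread would feed frames to the detector and this mapping would decide which servo to pulse for each detection.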
Challenges we ran into
A problem that proved difficult to solve was finding an effective way to notify the user of approaching objects. The AlwaysAI platform and APIs instantly proved useful, but our team needed to iterate beyond an initial electro-mechanical device before realizing the best solution was to "tap" the user with mini-servos.
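Driving a hobby mini-servo from the Pi comes down to software PWM. The helper below shows the standard angle-to-duty-cycle arithmetic for a 50 Hz servo signal; the 1-2 ms pulse convention is the common hobby-servo default, not the team's specific calibration:

```python
# Hedged sketch of hobby-servo PWM math (assumed 50 Hz / 20 ms period,
# 1-2 ms pulse convention; real servos often need per-unit calibration).

SERVO_FREQ_HZ = 50  # standard hobby-servo PWM frequency => 20 ms period

def angle_to_duty_cycle(angle_deg):
    """Convert a servo angle (0-180 degrees) to a PWM duty-cycle percentage.

    With the 1 ms (0 deg) to 2 ms (180 deg) pulse convention:
    1 ms / 20 ms = 5% duty, 2 ms / 20 ms = 10% duty.
    """
    if not 0 <= angle_deg <= 180:
        raise ValueError("angle must be between 0 and 180 degrees")
    pulse_ms = 1.0 + angle_deg / 180.0  # pulse width grows linearly, 1.0-2.0 ms
    return pulse_ms / 20.0 * 100.0      # percent of the 20 ms period
```

On the Pi itself, this percentage would be handed to a PWM channel (e.g. via a GPIO library) and toggled between two angles to produce the "tap".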
Accomplishments that we're proud of
We're proud of the team's ability to innovate and to identify a project that we each felt would challenge us, and that would demonstrate how AlwaysAI could be used to help bring accessibility to humans as well as machines. We entered this event as strangers to one another, and we feel accomplished in the way we grew into a team. We are also proud of finishing a product within 24 hours, considering how much of that time we spent discussing the project.
What we learned
We all gained experience with the Linux operating system on the Raspberry Pi. This was also the first hackathon for most of us, so it taught us how to work in high-pressure, time-limited situations with limited parts.
What's next for Visual Augmentation Aid for the Blind
Instead of producing makeshift vibration through servo motors, we could use actual vibration motors, which would be much easier to implement.
Built With
- alwaysai
- computer-vision
- linux
- neural-computing-stick
- protoboard
- python
- raspberry-pi
- servo-motor
- soldering
- webcam