Many people in our society are visually impaired, so we decided to build Smokescreen, a device and companion app that uses machine learning and object detection to assist with navigation.
What it does
It uses a camera and multiple sensors to notify the user of their surroundings. Notifications are delivered through 3D audio and vibration motors on both sides of the user, so the direction of the cue indicates the direction of the obstacle.
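The left/right cue idea can be sketched as a simple mapping from an object's horizontal position in the camera frame to intensities for the two vibration motors (or a stereo pan). This is an illustrative sketch, not the project's actual code; the function name and scaling are assumptions.

```python
# Hypothetical sketch: map an object's horizontal position in the
# camera frame to left/right cue intensities in [0, 1], usable for
# both vibration motors and stereo audio panning.

def side_cues(x_center: float, frame_width: float) -> tuple[float, float]:
    """Return (left, right) intensities for an object whose bounding-box
    center is at x_center pixels in a frame_width-pixel-wide frame."""
    # pan ranges from -1 (far left) to +1 (far right)
    pan = (2.0 * x_center / frame_width) - 1.0
    left = min(1.0, max(0.0, (1.0 - pan) / 2.0))
    right = min(1.0, max(0.0, (1.0 + pan) / 2.0))
    return left, right

# An object dead-center cues both sides equally:
print(side_cues(320, 640))  # (0.5, 0.5)
# An object at the right edge cues only the right side:
print(side_cues(640, 640))  # (0.0, 1.0)
```

On real hardware, the returned intensities would be fed to PWM duty cycles for the motors and to the audio panner.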
How we built it
We used a Raspberry Pi and trained an object detection model to recognize objects in the real world. We then connected the Android app to the Raspberry Pi over Bluetooth, so the phone can receive the camera's detection data through the Pi.
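One way the Pi could package detections for the phone is as newline-delimited JSON over the Bluetooth serial link. The project's actual wire format isn't documented, so the message schema below (label, confidence, x position) is an assumption for illustration.

```python
import json

# Hypothetical sketch of a Pi-to-phone message format: one JSON line
# per camera frame, listing each detected object. The field names and
# tuple layout are illustrative assumptions.

def encode_detections(detections):
    """detections: list of (label, confidence, x_center) tuples.
    Returns UTF-8 bytes ready to write to a Bluetooth serial socket."""
    payload = [
        {"label": lbl, "conf": round(conf, 2), "x": x}
        for lbl, conf, x in detections
    ]
    return (json.dumps(payload) + "\n").encode("utf-8")

msg = encode_detections([("person", 0.91, 320), ("chair", 0.67, 580)])
# The Android side would read one line per frame and decode it:
decoded = json.loads(msg.decode("utf-8"))
print(decoded[0]["label"])  # person
```

A line-oriented text format like this is easy to parse on the Android side with a buffered reader over the RFCOMM socket's input stream.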
Challenges we ran into
Getting the Android app to connect to the Raspberry Pi over Bluetooth was very difficult.
Accomplishments that we're proud of
We got object detection working successfully, which was a cool thing to see.
What we learned
How Android apps and embedded devices work together, and what they can do.
What's next for Smokescreen
Adding internet capabilities, improving sensor accuracy, and more.