Inspiration
We were first motivated by the idea of using a camera to detect objects. JP Morgan's challenge inspired us to create this application to help the visually impaired, and the wonderful existing APIs from Google Cloud and Microsoft Azure shaped how we built it.
What it does
Our Third Eye gives the visually impaired the ability to complete everyday tasks without the help of others or extra assistive tools.
How we built it
There are two components:
1. Intel RealSense Depth Camera D435i
The camera tracks the user's distance from objects in their surroundings and streams real-time depth data for processing (see the first sketch after this list).
2. Web app utilizing Firebase Realtime Database and Azure Computer Vision
The web app pulls object labels from Azure and depth readings from Firebase, then alerts the user to their surroundings (see the second sketch below).
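To illustrate the camera side, here is a minimal Python sketch that reads the distance at the center of the D435i's depth frame with pyrealsense2 and pushes it to a Firebase Realtime Database via firebase_admin. The service-account file, database URL, and the readings/distance_m path are assumptions for illustration, not the project's actual configuration.

```python
import time

import firebase_admin
import pyrealsense2 as rs
from firebase_admin import credentials, db

firebase_admin.initialize_app(
    credentials.Certificate("serviceAccount.json"),  # hypothetical key file
    {"databaseURL": "https://<your-project>.firebaseio.com"},  # hypothetical URL
)

# Configure the D435i's depth stream at 640x480, 30 fps.
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)

try:
    while True:
        frames = pipeline.wait_for_frames()
        depth = frames.get_depth_frame()
        if not depth:
            continue
        # Distance in meters to whatever is at the center of the frame.
        distance_m = depth.get_distance(320, 240)
        db.reference("readings/distance_m").set(distance_m)
        time.sleep(0.1)  # throttle database writes to roughly 10 Hz
finally:
    pipeline.stop()
```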
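On the alerting side, the actual project is a web app, which would most likely use browser APIs for speech and vibration. To keep the examples in one language, here is a sketch of the same listen-and-alert logic in Python, using firebase_admin's listen() and pyttsx3 as a stand-in for the browser's text-to-speech (vibration is device-side and not shown). The database path and the 1-meter alert threshold are assumptions.

```python
import firebase_admin
import pyttsx3
from firebase_admin import credentials, db

firebase_admin.initialize_app(
    credentials.Certificate("serviceAccount.json"),  # hypothetical key file
    {"databaseURL": "https://<your-project>.firebaseio.com"},  # hypothetical URL
)

engine = pyttsx3.init()

def on_reading(event):
    """Fires on every change to the watched path; event.data is the new value."""
    distance_m = event.data
    if isinstance(distance_m, (int, float)) and distance_m < 1.0:  # assumed threshold
        engine.say(f"Obstacle {distance_m:.1f} meters ahead")
        engine.runAndWait()

# listen() streams updates from the database on a background thread.
listener = db.reference("readings/distance_m").listen(on_reading)
input("Listening for readings; press Enter to quit.\n")
listener.close()
```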
Challenges we ran into
- Making sure the camera captured the correct object and distance
- Processing the data from Firebase and alerting the user via text-to-speech and vibration
- Reducing the lag from the camera to the database to the app
Accomplishments that we're proud of
- Alerts and notifications are as close to real time as possible
- A better understanding of machine learning
- Accurate object detection
What we learned
- Machine learning
- How to use the Firebase API
- How to use Azure Computer Vision
What's next for ThirdEye
- Increase accuracy and speed
- Perform more complex tasks
- Move from the dedicated depth camera to built-in smartphone cameras
- Develop the app for iOS and Android