Inspiration

Guide dogs can help people with disabilities navigate the world and warn them of impending danger. However, these dogs cost tens of thousands of dollars, so not everyone can afford one. We sought to emulate some of a guide dog's collision-warning functionality to help people who are hard of hearing.

What it does

Third Eye uses stereo cameras and computer vision to detect, classify, and track objects moving towards the user, and calculates whether a collision is likely. If so, the user is alerted and gains those few crucial seconds to get out of the way or brace for impact. This could help people with sensory disabilities avoid collisions.
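
The collision prediction can be sketched with simple kinematics: for an object moving at roughly constant velocity relative to the user, the closest approach happens at t = -(p·v)/(v·v). Here is a minimal Python sketch of that idea, assuming we already track each object's 3D position and velocity relative to the user; the function names (`will_collide`) and thresholds are illustrative, not our exact implementation:

```python
import numpy as np

def time_to_closest_approach(rel_pos, rel_vel):
    """Time at which a tracked object is nearest the user, given its
    position (m) and velocity (m/s) relative to the user."""
    speed_sq = np.dot(rel_vel, rel_vel)
    if speed_sq < 1e-9:          # effectively stationary relative to us
        return None
    t = -np.dot(rel_pos, rel_vel) / speed_sq
    return t if t > 0 else None  # only future approaches matter

def will_collide(rel_pos, rel_vel, danger_radius=0.5, horizon=3.0):
    """Alert if the closest approach falls inside a danger radius
    within the next few seconds."""
    t = time_to_closest_approach(rel_pos, rel_vel)
    if t is None or t > horizon:
        return False
    miss_distance = np.linalg.norm(rel_pos + rel_vel * t)
    return miss_distance < danger_radius
```

For instance, an object 2 m directly ahead and closing at 1.5 m/s reaches the user in about 1.3 s, so this check would fire with time to spare.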

How we built it

We use stereo cameras to derive depth data for nearby objects, estimate their trajectories, and compute the likelihood of a collision. Image recognition classifies the objects, and a collision-detection algorithm we developed using simple dynamics decides when to raise an alert. An Android app notifies the user.
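
For reference, the depth step could look roughly like the following OpenCV block-matching sketch on a calibrated, rectified stereo pair; the focal length and baseline below are placeholder calibration values, not our actual rig's:

```python
import cv2
import numpy as np

# Placeholder calibration values -- in practice these come from
# stereo calibration of the camera rig.
FOCAL_PX = 700.0    # focal length in pixels
BASELINE_M = 0.06   # distance between the two cameras in metres

# Block-matching stereo: disparity map from rectified grayscale frames.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)

def depth_map(left_gray, right_gray):
    """Convert a rectified stereo pair into per-pixel depth in metres."""
    # compute() returns fixed-point disparities scaled by 16
    disparity = stereo.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan         # unmatched / invalid pixels
    return FOCAL_PX * BASELINE_M / disparity   # depth = f * B / d
```

Feeding successive depth maps through the object tracker gives the relative positions and velocities used by the collision check above.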

Challenges we ran into

A lot of the hardware we required wasn't available, and there was no 3D printer to make some of the parts we needed for the mount. We had to make do with what we had (what's a hackathon without some ramshackle shenanigans?).

Accomplishments that we're proud of

We're proud to have made something that could eventually be used to help somebody with a disability better navigate the world and hopefully save them from a collision.

What we learned

Coming from diverse backgrounds, we all learned from each other's specialties as hardware, backend, and frontend came together.

What's next for Third Eye

We hope to further refine the UI so the system can be used practically by people who are hard of hearing.
