Inspiration

One of our team members has an aunt who lost her vision and could not afford either a guide dog or the surgery that might have restored it. Seeing her lose independence because of cost made us realize how fragile mobility can be. We wanted to build something that could restore that confidence for anyone, anywhere. GuidePup was born from the idea that technology should help people move freely and live independently.

What It Does

GuidePup uses a phone or camera-equipped glasses to detect obstacles in real time and provide audio or haptic alerts. With built-in map support, it can also help users navigate to specific destinations safely and independently. Our goal is to make everyday mobility more accessible for people with visual impairments.
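As a rough illustration of the alerting behavior described above, the sketch below maps a detected obstacle's distance to an audio or haptic cue. The `Alert` type, the helper name, and the distance thresholds are all hypothetical, not GuidePup's actual values:

```python
# Illustrative sketch only: Alert, alert_for_obstacle, and the distance
# thresholds are assumptions, not GuidePup's real implementation.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Alert:
    mode: str      # "audio" or "haptic"
    message: str

def alert_for_obstacle(distance_m: float, label: str) -> Optional[Alert]:
    """Closer obstacles get urgent haptic pulses; farther ones a spoken cue."""
    if distance_m < 1.0:
        return Alert("haptic", f"{label} directly ahead")
    if distance_m < 3.0:
        return Alert("audio", f"{label} {distance_m:.0f} meters ahead")
    return None  # far enough away that no alert is needed

print(alert_for_obstacle(0.8, "bicycle"))
```

Keeping the urgency decision in one small function like this makes it easy to tune thresholds per user or per device.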

How We Built It

We connected Physical.inc’s smart-glasses hardware to a custom obstacle detection system built with computer vision and mapping tools. The camera feed runs through our detection model, which identifies nearby obstacles and sends instant audio or vibration feedback. We used Python, OpenCV, and the Physical.inc SDK to integrate real-time alerts with spatial mapping.
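The flow above (camera feed in, detections out, alerts dispatched) can be sketched as a small pipeline. In the real system the frames come from the glasses' camera (e.g. via `cv2.VideoCapture`) and `detect` is the OpenCV-based model; both are stubbed here so the sketch is self-contained:

```python
# Minimal sketch of the frame-to-feedback loop. The frame source and the
# detector are stubs standing in for the camera feed and the CV model.
from typing import Callable, Iterable, List, Tuple

Detection = Tuple[str, float]  # (label, estimated distance in meters)

def run_pipeline(frames: Iterable[object],
                 detect: Callable[[object], List[Detection]],
                 notify: Callable[[str], None]) -> None:
    """For each frame, run detection and emit one alert per nearby obstacle."""
    for frame in frames:
        for label, distance in detect(frame):
            if distance < 3.0:               # illustrative alert threshold
                notify(f"{label} {distance:.1f} m ahead")

# Stub detector: frame 0 has a nearby curb, frame 1 only a distant tree.
alerts = []
run_pipeline(frames=[0, 1],
             detect=lambda f: [("curb", 1.2)] if f == 0 else [("tree", 8.0)],
             notify=alerts.append)
print(alerts)  # only the nearby curb triggers an alert
```

Passing `detect` and `notify` in as callables keeps the loop testable without hardware attached.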

Challenges We Ran Into

We faced several technical and hardware challenges. Designing shared state between the navigation and object-detection loops took time, since we had to keep the two from interfering with each other. Connecting the glasses was another obstacle, as they were only compatible with Windows, not macOS. Finally, streaming the camera feed was difficult since we could not view what the glasses' camera was capturing in real time.
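One common way to keep two loops from interfering, sketched below, is to funnel all shared reads and writes through a lock-protected object. `NavigationState` and its fields are illustrative assumptions, not our actual classes:

```python
# Sketch of lock-protected shared state between a detection thread and a
# navigation loop. NavigationState and its fields are hypothetical.
import threading

class NavigationState:
    def __init__(self):
        self._lock = threading.Lock()
        self._obstacles = []          # written by the detection thread
        self._next_waypoint = None    # written by the navigation loop

    def update_obstacles(self, obstacles):
        with self._lock:
            self._obstacles = list(obstacles)

    def plan_snapshot(self):
        """Atomic read of everything the route planner needs at once."""
        with self._lock:
            return list(self._obstacles), self._next_waypoint

state = NavigationState()
t = threading.Thread(target=state.update_obstacles, args=([("curb", 1.2)],))
t.start(); t.join()
print(state.plan_snapshot())
```

Returning a snapshot copy (rather than the live list) means the planner never sees a half-updated obstacle set.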

Accomplishments We're Proud Of

We are proud of our idea as a whole, as it is a step toward making the world more accessible and affordable, especially for those with disabilities. We are also proud that we stepped up to the challenge of implementing a hardware-based solution, expanding the project's scope beyond software. On top of that, we love the UI and how intuitive and engaging our demo turned out.

What We Learned

Every member of our team walked away with new skills and a new sense of what we were capable of. Some of us wrote our first lines of code, while others learned how to bring an idea to life through hardware for the first time. We spent long nights experimenting with Python, OpenCV, and the Physical.inc SDK, streaming video and refining our detection system to make it feel as natural as possible. This hackathon taught us more than just how to build something. It showed us why we build. We learned the importance of empathy in design, patience in problem-solving, and how even a small group of students can use technology to make someone’s world a little safer and more independent.

What's Next For GuidePup 🦮

We plan to continue rolling out new features and improving our detection accuracy and responsiveness. In the future, we hope to integrate GuidePup with larger platforms like Meta, Apple, or Ray-Ban to reach more people. Most importantly, since affordability is at the heart of this project, we want to partner with NGOs and nonprofits to make GuidePup free or low-cost in countries where blindness is most prevalent.

Built With

Python, OpenCV, Physical.inc SDK
