Inspiration

It all started with a simple question: What if the world could speak to those who can’t see it? One of our team members shared a story about their visually impaired cousin, who often felt trapped by the simplest things—like reading a menu or avoiding a curb. Expensive devices like smart canes were out of reach, and they relied heavily on others for help. That broke our hearts. We wanted to create something that would give freedom, safety, and dignity back to the visually impaired—using a tool they already have: their smartphone. That’s how AidVision was born, a project to turn the invisible into something vivid and empowering.

What it does

AidVision transforms a smartphone into a pair of intelligent "eyes" for the visually impaired. It reads signs, boards, and text aloud, detects dangers like obstacles or traffic, and narrates surroundings in real time through natural voice output. Imagine walking down a busy street and hearing, “There’s a low-hanging branch 3 steps ahead,” or “The sign says Bus 42 to downtown.” Whether through a speaker or earphones, AidVision guides users with clarity and confidence, helping them navigate the world independently—safely and with dignity.

How we built it

We poured our hearts into making AidVision both powerful and accessible. We used a lightweight YOLO model for object detection and Tesseract for optical character recognition, keeping inference fast enough for real-time use even on low-spec devices. For voice output, we integrated Google Text-to-Speech to deliver clear, human-like narration. The app was built with React Native for seamless cross-platform deployment on iOS and Android. We optimized it for low-bandwidth environments and ensured critical features work offline, so users can rely on it anytime, anywhere. It was a sprint to balance performance with accessibility, but every line of code felt like a step toward impact.
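The core loop—detect, interpret, narrate—can be sketched roughly as below. This is an illustrative sketch, not the app's actual code: the `Detection` structure and the step-distance heuristic (inferring closeness from bounding-box height relative to the frame) are our assumptions here; in the real pipeline the detections would come from YOLO and the resulting sentence would be handed to Google TTS for speech.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str        # object class from the detector, e.g. "low-hanging branch"
    box_height: int   # bounding-box height in pixels
    frame_height: int # camera frame height in pixels

def estimate_steps(det: Detection) -> int:
    """Rough heuristic: the larger the box relative to the frame,
    the closer the object. Maps relative height to 1-10 steps."""
    ratio = det.box_height / det.frame_height
    return max(1, round((1.0 - ratio) * 10))

def narrate(detections: list[Detection]) -> str:
    """Turn a frame's detections into one spoken sentence,
    warning about the nearest hazard first."""
    if not detections:
        return "Path ahead looks clear."
    nearest = max(detections, key=lambda d: d.box_height / d.frame_height)
    return f"There's a {nearest.label} {estimate_steps(nearest)} steps ahead."

# Example: a branch filling 70% of the frame height is close.
print(narrate([Detection("low-hanging branch", 700, 1000)]))
# → There's a low-hanging branch 3 steps ahead.
```

In the app itself this sentence would be queued to the TTS engine rather than printed; keeping narration to one short sentence per frame avoids flooding the user with audio.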

Challenges we ran into

Building AidVision wasn’t without its hurdles. Optimizing YOLO for low-spec devices was a struggle—early tests were sluggish, and we worried users would be left waiting in critical moments. We also faced issues with Tesseract’s accuracy on blurry or angled text, like street signs. Balancing offline functionality with performance was another headache; we had to prioritize features without compromising reliability. Late-night debugging sessions and countless cups of coffee later, we fine-tuned the app, but those challenges taught us resilience and the importance of user-centered design.
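One mitigation we leaned on for Tesseract's accuracy problems is binarizing the image before OCR, so low-contrast text becomes crisp black-on-white. Below is a minimal, standard-library-only sketch of Otsu's global threshold on a flat list of grayscale pixel values; in the app a proper image library would do this on the camera frame, so treat this as an illustration of the idea rather than our production code.

```python
def otsu_threshold(pixels: list[int]) -> int:
    """Find the grayscale threshold (0-255) that maximizes
    between-class variance, per Otsu's method."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    sum_bg = 0.0   # running sum of intensities in the background class
    weight_bg = 0  # running pixel count in the background class
    best_t, best_var = 0, -1.0
    for t in range(256):
        weight_bg += hist[t]
        if weight_bg == 0:
            continue
        weight_fg = total - weight_bg
        if weight_fg == 0:
            break
        sum_bg += t * hist[t]
        mean_bg = sum_bg / weight_bg
        mean_fg = (sum_all - sum_bg) / weight_fg
        var_between = weight_bg * weight_fg * (mean_bg - mean_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def binarize(pixels: list[int]) -> list[int]:
    """Map each pixel to pure black (0) or white (255)."""
    t = otsu_threshold(pixels)
    return [255 if p > t else 0 for p in pixels]

# Example: dark text pixels (~40) on a lighter background (~200).
print(binarize([40, 42, 38, 200, 205, 198, 41, 201]))
# → [0, 0, 0, 255, 255, 255, 0, 255]
```

Cleaning the input this way costs little on-device compute, which mattered more to us than squeezing extra accuracy out of the OCR model itself.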

Accomplishments that we're proud of

We’re incredibly proud of creating an app that truly empowers. During testing, one of our visually impaired friends used AidVision to read a café menu on their own for the first time—they smiled so wide, and we knew we’d made a difference. We’re also proud of how lightweight and accessible we made it: AidVision runs smoothly on older smartphones and works offline for essential features. Seeing our vision come to life as a practical, impactful tool feels like a win—not just for the hackathon, but for the people we’re helping.

What we learned

This journey taught us so much. Technically, we deepened our skills in computer vision, OCR, and mobile development, learning to optimize AI models for real-world constraints. But more importantly, we learned empathy—how to design for real human needs, not just theoretical ones. We discovered the power of accessibility in tech and the importance of testing with real users. Every feedback session reminded us that behind every line of code is a person counting on us to get it right.

What's next for AidVision – Empowering the Visually Impaired with AI

AidVision is just the beginning. We want to add multilingual support so it can serve users worldwide, and integrate more advanced danger detection, like recognizing moving vehicles’ speed and direction. We’re also exploring haptic feedback for users who prefer vibrations over audio cues. Our dream is to make AidVision open-source, inviting a global community to improve it and ensure it reaches every visually impaired person who needs it. This isn’t just an app—it’s a movement to make the world more inclusive, one smartphone at a time.

Built With

React Native · YOLO · Tesseract · Google Text-to-Speech
