Inspiration

I was inspired to build Saathi after noticing the challenges visually impaired users face while navigating busy urban spaces. Existing navigation tools often rely heavily on visuals, leaving a gap for users who need real-time, audio-first guidance. I wanted to create a companion app that provides independence, safety, and confidence for everyone who benefits from auditory navigation, including those with reading or visual challenges.

What it does

Saathi is a voice-first navigation companion. It:

- Provides AI-powered voice guidance for real-time navigation.
- Handles geolocation data efficiently to deliver accurate directions.
- Follows accessibility-first design principles, so the app is usable by a diverse set of users.
- Optimizes latency and performance for smooth, instant audio feedback.
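The voice-guidance step above can be sketched as a small helper that turns a routing step into a concise spoken cue for text-to-speech. This is an illustrative sketch, not Saathi's actual code: the `toSpokenCue` helper and the field names (`maneuver`, `distanceMeters`, `streetName`) are assumptions, and real routing API responses use a different shape.

```javascript
// Hypothetical helper: convert one routing step into a short, speech-friendly
// instruction. Distances are rounded so the spoken cue stays easy to follow.
function toSpokenCue(step) {
  const meters = Math.round(step.distanceMeters / 10) * 10; // round for speech
  const distance =
    meters >= 1000
      ? `${(meters / 1000).toFixed(1)} kilometers`
      : `${meters} meters`;
  const actions = {
    "turn-left": "turn left",
    "turn-right": "turn right",
    straight: "continue straight",
  };
  const action = actions[step.maneuver] || "continue";
  return `In ${distance}, ${action} onto ${step.streetName}.`;
}
```

The resulting string would then be handed to a text-to-speech engine (for example, a React Native speech module) rather than rendered on screen.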

How we built it

- Frontend: React Native for cross-platform mobile app development.
- Backend: Node.js with Express for API handling.
- AI & Voice: Integrated text-to-speech and natural language processing for contextual guidance.
- Maps & Navigation: Google Maps API for routing, with real-time GPS updates.
- Accessibility Features: Audio cues, voice commands, and haptic feedback for inclusive navigation.
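The real-time GPS updates mentioned above imply a recurring decision: when is the user close enough to the next maneuver that the app should speak the cue? A minimal JavaScript sketch, assuming hypothetical `haversineMeters` and `shouldAnnounce` helpers and plain `{lat, lon}` points (not Saathi's actual implementation):

```javascript
// Mean Earth radius in meters, used by the haversine formula.
const EARTH_RADIUS_M = 6371000;

// Great-circle distance between two {lat, lon} points, in meters.
function haversineMeters(a, b) {
  const toRad = (deg) => (deg * Math.PI) / 180;
  const dLat = toRad(b.lat - a.lat);
  const dLon = toRad(b.lon - a.lon);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(a.lat)) * Math.cos(toRad(b.lat)) * Math.sin(dLon / 2) ** 2;
  return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(h));
}

// Speak the next cue once the user is within `thresholdMeters` of the turn.
function shouldAnnounce(current, turnPoint, thresholdMeters = 30) {
  return haversineMeters(current, turnPoint) <= thresholdMeters;
}
```

In practice the threshold would be tuned to walking speed and GPS noise; a fixed 30 meters is only a placeholder here.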

Challenges we ran into

- Ensuring real-time accuracy of directions in crowded or GPS-challenged areas.
- Designing a minimal, intuitive interface that works entirely through voice and touch.
- Handling edge cases, such as unexpected roadblocks or sudden changes in route.
- Making the app inclusive: useful not just for blind users but for anyone who benefits from audio navigation.

Accomplishments that we're proud of

- Developed an AI-powered, voice-first navigation app for accessibility.
- Enabled real-time, safe, and independent navigation for visually impaired users and others who benefit from audio guidance.
- Integrated text-to-speech, haptic feedback, and contextual directions for a seamless experience.
- Built a cross-platform mobile solution with smooth performance and minimal latency.

What we learned

- How to design accessibility-first experiences that are inclusive beyond visual impairments.
- Optimizing real-time navigation algorithms for accuracy in urban environments.
- Integrating AI and voice technologies into mobile applications.
- Handling edge cases in mapping and routing while maintaining a smooth user experience.

What's next for Saathi

- Meta glasses integration for hands-free, immersive navigation.
- Haptic sensors that provide intuitive tactile feedback for safer movement.
- Public transit support with real-time updates and step-by-step guidance.
- Expanded accessibility features to serve a wider range of users who benefit from audio and haptic navigation.
