StepSight: Your Eyes, Your Path

Empowering Independent Navigation with Real-time Obstacle Detection

Problem

Navigating unfamiliar or challenging environments can be a significant hurdle for individuals with visual impairments. Traditional aids often lack the real-time, dynamic information needed to avoid unexpected obstacles safely and confidently, leading to anxiety, reduced independence, and potential hazards.

Solution

StepSight is a mobile application designed to enhance the safety and independence of visually impaired users by providing real-time obstacle detection and intuitive feedback. Using the device's camera, StepSight identifies objects in the user's path and translates this visual information into actionable audio and haptic alerts, guiding the user safely through their environment.

Key Features

• Real-time Object Detection: Continuously scans the environment for obstacles using advanced AI (currently simulated for robust demonstration, with a clear path to API or on-device ML integration).
• Intelligent Alert System: Prioritizes and filters detections to surface only the most relevant information, reducing cognitive load. Alerts are categorized by urgency (Urgent, Warning, Info) based on distance and object type.
• Multi-modal Feedback:
  ◦ Audio Announcements: Clear, spoken alerts describe detected objects, their distance (in "steps"), and direction.
  ◦ Haptic Feedback: Vibrations provide immediate, non-intrusive warnings, with distinct patterns for different alert types (e.g., strong pulses for urgent obstacles).
• Intuitive User Interface: A clean, accessible interface with large, high-contrast controls. The camera view is overlaid with visual indicators (for sighted helpers or future enhancements) showing detection zones and object bounding boxes.
• Customizable Settings: Users can adjust key parameters such as audio announcement delay, haptic feedback intensity, and their average "step length" for personalized distance estimation.
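The step-based distance estimation and urgency categorization described above can be sketched as pure functions. This is a minimal sketch, not the app's actual implementation: the thresholds, the `AlertLevel` names, and `DEFAULT_STEP_LENGTH_M` are illustrative assumptions.

```typescript
// Illustrative sketch: convert distances to "steps" and pick an
// alert urgency. All thresholds and constants are assumptions.
type AlertLevel = "Urgent" | "Warning" | "Info";

const DEFAULT_STEP_LENGTH_M = 0.75; // user-adjustable in Settings

// Convert a raw distance estimate (meters) into "steps",
// personalized by the user's configured step length.
function metersToSteps(meters: number, stepLengthM = DEFAULT_STEP_LENGTH_M): number {
  return Math.max(1, Math.round(meters / stepLengthM));
}

// Categorize a detection by distance and object type: hazardous
// objects escalate one level compared to benign ones.
function categorize(meters: number, isHazard: boolean): AlertLevel {
  if (meters < 1.5 || (isHazard && meters < 3)) return "Urgent";
  if (meters < 3 || (isHazard && meters < 6)) return "Warning";
  return "Info";
}

// Example: an object 2.2 m away with a 0.75 m step length.
console.log(metersToSteps(2.2));     // 3 (steps)
console.log(categorize(2.2, false)); // "Warning"
```

Keeping this logic in pure functions makes the alert policy easy to unit-test and to tune per user without touching the camera or feedback layers.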
• Robust API/Simulation Fallback: A flexible architecture that can seamlessly switch between an external AI detection API and an intelligent local simulation, ensuring continuous functionality even without internet connectivity or when the API is unavailable.
• Expo Router Navigation: Smooth, modern navigation between the Camera screen and a Home screen for a complete app experience.

Technology Stack

• React Native & Expo: Cross-platform development (iOS, Android, Web) from a single codebase, enabling rapid iteration and broad accessibility.
• expo-camera: Powers the real-time video stream and frame capture.
• react-native-reanimated: Smooth, performant UI animations (e.g., pulsing alerts, glowing indicators) that run on the UI thread for a fluid user experience.
• expo-speech: Text-to-speech for clear audio alerts.
• expo-haptics: Customizable haptic feedback for tactile alerts.
• lucide-react-native: Crisp, scalable vector icons across the application.
• Custom AIDetectionService: A service layer that handles detection processing, filtering, object tracking, and alert generation (currently backed by a sophisticated simulation, ready for real AI integration).
• expo-router: Declarative, efficient navigation management.

Impact & Vision

StepSight aims to be a transformative tool for the visually impaired community, fostering greater independence, confidence, and safety in daily life. By providing crucial real-time spatial awareness, StepSight lets users navigate their surroundings with reduced reliance on others, opening up new possibilities for exploration and participation. Our vision is to continuously refine detection accuracy, integrate advanced on-device AI models, and explore more sophisticated feedback mechanisms to make the world more accessible, one step at a time.

Experience StepSight and envision a future of unhindered navigation!
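The API/simulation fallback above follows a simple try-then-degrade pattern. A minimal sketch of the idea, where the `Detection` shape and the `fetchFromApi` / `simulateDetections` names are hypothetical stand-ins for AIDetectionService's actual internals:

```typescript
// Sketch of the fallback: try the external detection API first,
// fall back to the local simulation on any failure (names hypothetical).
interface Detection {
  label: string;
  distanceM: number; // estimated distance in meters
}

type DetectFn = () => Promise<Detection[]>;

async function detectWithFallback(
  fetchFromApi: DetectFn,       // external AI detection API
  simulateDetections: DetectFn  // local simulation, always available
): Promise<{ source: "api" | "simulation"; detections: Detection[] }> {
  try {
    const detections = await fetchFromApi();
    return { source: "api", detections };
  } catch {
    // Offline or API error: degrade gracefully to simulation.
    return { source: "simulation", detections: await simulateDetections() };
  }
}

// Example: the API is unreachable, so the simulation answers.
const failingApi: DetectFn = async () => { throw new Error("offline"); };
const sim: DetectFn = async () => [{ label: "chair", distanceM: 2.2 }];
detectWithFallback(failingApi, sim).then(r => console.log(r.source)); // "simulation"
```

Because both sources satisfy the same `DetectFn` contract, the rest of the pipeline (filtering, tracking, alert generation) never needs to know which one produced the detections.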
