Inspiration

Over 43 million people worldwide are blind, yet most assistive tools are still basic canes. We wanted to build something smarter, without requiring a cellphone or other handheld technology. Solari helps users understand their surroundings, not just avoid them. It reimagines navigation with AI-powered awareness and intuitive feedback.
What does it do?

The Solari Sleek Headset is an AI-powered wearable for the visually impaired. Using ultrasonic distance sensors, buzzers, and a camera, Solari detects nearby obstacles and identifies key objects. It delivers real-time audio and vibration cues to help users navigate safely and confidently, both indoors and outdoors.
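The write-up does not specify how distance readings are turned into cues, but the mapping is simple to sketch. Here is a minimal, hypothetical version in which closer obstacles produce stronger vibration and faster beeps; all thresholds and intensity values are illustrative assumptions, not the device's actual tuning.

```python
def feedback_for_distance(distance_cm: float) -> dict:
    """Map an ultrasonic distance reading to hypothetical feedback cues.

    Thresholds and intensities below are assumptions for illustration;
    the real Solari tuning is not described in the write-up.
    """
    if distance_cm < 50:      # very close: strong vibration, rapid beeps
        return {"buzz_intensity": 1.0, "beep_interval_s": 0.1}
    elif distance_cm < 150:   # approaching: moderate cues
        return {"buzz_intensity": 0.5, "beep_interval_s": 0.5}
    elif distance_cm < 300:   # detected at range: gentle awareness cue
        return {"buzz_intensity": 0.2, "beep_interval_s": 1.0}
    return {"buzz_intensity": 0.0, "beep_interval_s": 0.0}  # path is clear
```

A graded mapping like this is what lets the device "empower" rather than just alert: the cue itself carries how urgent the obstacle is.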
How we built it

Hardware:
- Raspberry Pi running QNX OS for fast, reliable processing
- Raspberry Pi Camera Module 3 for real-time visual input
- Ultrasonic distance sensors for spatial safety feedback
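Ultrasonic ranging itself is pure arithmetic: the sensor emits a ping and times the echo, and distance follows from the speed of sound. A sketch of that conversion (assuming an HC-SR04-style time-of-flight sensor, which the write-up does not name):

```python
SPEED_OF_SOUND_CM_PER_S = 34300  # approximate speed of sound in air at ~20 degrees C

def echo_to_distance_cm(pulse_duration_s: float) -> float:
    """Convert a round-trip echo pulse duration to distance in cm.

    The ping travels to the obstacle and back, so the one-way
    distance is half the total path.
    """
    return pulse_duration_s * SPEED_OF_SOUND_CM_PER_S / 2
```

For example, a 10 ms echo corresponds to an obstacle roughly 1.7 m away, comfortably inside walking-reaction range.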
Software:
- Twelve Labs API for scene analysis and object tracking
- Gemini API for generating concise, conversational scene descriptions
- Audio feedback via speakers and soft buzzers for haptics
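The glue between the two APIs is a prompt: scene-analysis output gets condensed into a request for a short spoken description. A hypothetical sketch of that step; the field names (`label`, `position`) and the prompt wording are assumptions, since neither the Twelve Labs output schema used nor the actual prompt appears in the write-up.

```python
def build_scene_prompt(objects: list[dict]) -> str:
    """Build a hypothetical Gemini prompt from detected scene objects.

    `objects` mimics the kind of label/position pairs a scene-analysis
    API might return; the real schema is an assumption here.
    """
    parts = [f"{obj['label']} ({obj['position']})" for obj in objects]
    return (
        "Describe this scene in one short, conversational sentence "
        "for a visually impaired pedestrian: " + ", ".join(parts)
    )
```

The resulting string would then be sent to the Gemini API, and its reply read aloud through the headset speakers.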
Challenges we ran into:
QNX camera data pipeline issues

We struggled to get the Raspberry Pi camera feed properly formatted for processing on QNX; the raw video data was not compatible with Twelve Labs, blocking real-time scene understanding.

How we solved it: we split the workload. One Raspberry Pi runs standard Linux for camera processing and AI scene analysis, while QNX handles the ultrasonic sensors and the object detection system. This modular approach allowed both systems to run smoothly in parallel.
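The write-up does not say how the two boards exchange data; one simple option is a small TCP link in which the QNX side publishes sensor readings and the Linux side fetches them. The sketch below demonstrates the idea on loopback with both roles in one process; the JSON message format and the whole protocol are assumptions for illustration.

```python
import json
import socket
import threading

addr = {}                     # filled in with the server's chosen port
ready = threading.Event()     # signals that the server is listening

def serve_once():
    """Stand-in for the QNX-side board: publish one sensor reading."""
    with socket.socket() as srv:
        srv.bind(("127.0.0.1", 0))          # let the OS pick a free port
        srv.listen(1)
        addr["port"] = srv.getsockname()[1]
        ready.set()
        conn, _ = srv.accept()
        with conn:
            conn.sendall(json.dumps({"distance_cm": 142.0}).encode())

def fetch_reading() -> dict:
    """Stand-in for the Linux-side board: fetch the latest reading."""
    ready.wait()
    with socket.socket() as cli:
        cli.connect(("127.0.0.1", addr["port"]))
        return json.loads(cli.recv(1024).decode())

server = threading.Thread(target=serve_once)
server.start()
reading = fetch_reading()
server.join()
```

On the real device the two roles would run on separate boards with fixed addresses, but the decoupling is the same: each side only needs to agree on the message format, which is what makes the modular split workable.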
Outdated Raspberry Pi OS and compatibility errors

Our Raspberry Pi environment was running an outdated OS version, leading to broken dependencies and compatibility issues.

How we solved it: we performed a full system update and reinstalled the key packages, restoring functionality while keeping resource usage minimal for wearable deployment.
Accomplishments that we're proud of:
- Built a fully functional AI-powered wearable prototype
- Integrated multi-modal feedback (vision, sound, and touch) into a single user-friendly system
- Created a solution that empowers users rather than just alerting them
What did we learn?
- How to deploy real-time AI pipelines using QNX for safety-critical applications
- The importance of seamless hardware-software integration in assistive devices
- That human-centred design is key: accessibility is not just about the tech, it is about the experience
What's next for Solari?
- User testing with visually impaired individuals to refine the feedback system
- Miniaturizing the hardware for daily wearing comfort
- Expanding AI capabilities to include facial recognition, text reading, and traffic signal detection
Built With
- c
- gemini
- python
- qnx
- raspberry-pi
- sensors
- twelvelabs