We built the Accessible Spatial Assistant for FronteraHacks to help visually impaired individuals navigate chaotic supermarkets with confidence. Our app transforms the shopping experience with an interface managed entirely through speech and haptic screen vibrations. We built it with Flutter for the mobile app and Python with FastAPI for the backend. When a user says they want to cook a specific meal, our AI breaks the meal down into a grocery list and checks local supermarket inventory in real time. Once at the store, the app uses the phone's live camera to scan shelves, identify the exact ingredients, and guide the user directly to them with spoken audio cues.
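The meal-to-grocery-list flow above can be sketched in plain Python. This is a minimal illustration, not our actual backend: the function names, the hard-coded recipe table standing in for the AI call, and the dictionary standing in for the live inventory API are all hypothetical.

```python
def break_down_meal(meal: str) -> list[str]:
    """Stand-in for the AI step that turns a meal name into ingredients.
    The real version would call a language model; here a lookup table
    keeps the sketch self-contained."""
    recipes = {
        "spaghetti": ["pasta", "tomato sauce", "ground beef", "parmesan"],
    }
    return recipes.get(meal.lower(), [])


def check_inventory(items: list[str], stock: dict[str, int]) -> dict[str, bool]:
    """Stand-in for the real-time supermarket inventory check:
    marks each ingredient as in stock or not."""
    return {item: stock.get(item, 0) > 0 for item in items}


# Hypothetical store stock; "ground beef" is absent and "parmesan" is sold out.
stock = {"pasta": 12, "tomato sauce": 3, "parmesan": 0}
grocery_list = break_down_meal("Spaghetti")
availability = check_inventory(grocery_list, stock)
```

In the real app this logic sits behind a FastAPI endpoint that the Flutter client calls after transcribing the user's voice request.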
Getting the app to run smoothly on a physical iPhone was our biggest hurdle. We had to work around Apple's strict security rules, strip hidden metadata files to get the camera working, and figure out how to force the phone to speak aloud even when the physical silent switch was engaged. Along the way we learned a massive amount about mobile hardware and designing sight-free interfaces. We also learned how to optimize the physical shopping route itself by implementing a modified Dijkstra's algorithm. By treating the store's aisles as a graph, the algorithm continuously computes the shortest walking distance from the user to their next item, so our users don't just find their groceries, they are guided through the store along the quickest possible route.
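The routing idea can be sketched with a textbook Dijkstra over an aisle graph. This is an illustrative version, not our production code: the node names and distances in the sample store layout are made up, and the graph shape (an adjacency list of intersections with edge weights in meters) is an assumption.

```python
import heapq


def shortest_path(graph, start, goal):
    """Dijkstra's algorithm over a store layout graph.

    graph: {node: [(neighbor, distance_in_meters), ...]}
    Returns (path_as_list_of_nodes, total_distance), or (None, inf)
    if the goal is unreachable."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    visited = set()
    while heap:
        d, node = heapq.heappop(heap)
        if node in visited:
            continue
        visited.add(node)
        if node == goal:
            break
        for nbr, weight in graph.get(node, []):
            nd = d + weight
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(heap, (nd, nbr))
    if goal not in dist:
        return None, float("inf")
    # Walk the predecessor chain backwards to recover the route.
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    path.reverse()
    return path, dist[goal]


# Hypothetical aisle-intersection graph (names and meters are illustrative).
store = {
    "entrance": [("aisle1", 5.0), ("aisle2", 8.0)],
    "aisle1": [("entrance", 5.0), ("dairy", 4.0)],
    "aisle2": [("entrance", 8.0), ("dairy", 3.0)],
    "dairy": [("aisle1", 4.0), ("aisle2", 3.0)],
}
path, meters = shortest_path(store, "entrance", "dairy")
# path → ["entrance", "aisle1", "dairy"], meters → 9.0
```

Re-running this each time the user's estimated position changes is what keeps the spoken guidance pointed at the quickest remaining route.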