Inspiration

It’s a pain to have to buy a service dog these days: they cost about $15,000 to $50,000, and even people who can pay often wait a long time before they get the help they need to move around safely. That's why we built No Way Home, an app that guides visually impaired people to where they need to go safely, without spending thousands of dollars on dog food and vet visits.

What it does

No Way Home uses multiple AI integrations to detect obstacles and warn visually impaired users about them. Using the phone's camera, No Way Home detects obstacles in the user's path. It also accepts voice commands for navigation, letting the user steer around detected obstacles without seeing them.

How we built it

No Way Home is a web application that uses the HTML5 camera streaming APIs to capture video from the phone's camera. Each frame is fed into an object detection model invoked through TensorFlow.js (TFLite) to detect obstacles. When an obstacle is detected, we warn the user with ElevenLabs text-to-speech. We also accept voice commands from the user via ElevenLabs speech-to-text; on the backend, an LLM from Featherless.ai translates those commands into semantic commands that the application can understand.
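As a rough sketch of that last step, the backend can ask the LLM to reply with a small JSON object and then validate it before acting on it. The `intent` names and the `parseSemanticCommand` helper below are our illustration of the idea, not the exact production code:

```javascript
// Hypothetical shape of the semantic commands the app understands.
// We prompt the LLM to reply with JSON like: {"intent": "turn", "direction": "left"}
const KNOWN_INTENTS = new Set(["turn", "stop", "continue", "describe"]);

// Parse and validate the LLM's raw text reply into a semantic command.
// Returns null when the reply is malformed, so the app can ask the user to repeat.
function parseSemanticCommand(llmReply) {
  let parsed;
  try {
    parsed = JSON.parse(llmReply);
  } catch {
    return null;
  }
  if (!parsed || !KNOWN_INTENTS.has(parsed.intent)) return null;
  if (parsed.intent === "turn" && !["left", "right"].includes(parsed.direction)) {
    return null;
  }
  return parsed;
}
```

Rejecting anything outside the known intent set means a garbled transcription or an off-script LLM reply degrades into a "please repeat" prompt instead of an unexpected action.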

Challenges we ran into

We trained our detection model with YOLO, so it was difficult to get it running in the browser. We eventually got it working with TFLite in the browser, but it took a while because of the way TensorFlow formats model outputs. We also ran into text-to-speech loading slowly from the server: audio files took a while to reach the client, so we solved it by preloading common TTS snippets.
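To illustrate the output-format issue: YOLO-style models typically return a flat tensor of candidate rows shaped like `[cx, cy, w, h, classScore0, classScore1, ...]`, which has to be decoded by hand. A minimal sketch of decoding one row (the class names, threshold, and exact layout here are assumptions for illustration, not our specific model):

```javascript
// Decode one YOLO-style prediction row: [cx, cy, w, h, score_0, score_1, ...].
// Coordinates are normalized center/size; we convert them to corner coordinates
// and keep the detection only if its best class score clears the threshold.
const CLASS_NAMES = ["tree", "pole", "bench"]; // illustrative labels
const SCORE_THRESHOLD = 0.5;

function decodeRow(row) {
  const [cx, cy, w, h] = row;
  const scores = row.slice(4);
  let best = 0;
  for (let i = 1; i < scores.length; i++) {
    if (scores[i] > scores[best]) best = i;
  }
  if (scores[best] < SCORE_THRESHOLD) return null;
  return {
    label: CLASS_NAMES[best],
    score: scores[best],
    box: { x1: cx - w / 2, y1: cy - h / 2, x2: cx + w / 2, y2: cy + h / 2 },
  };
}
```

In practice you would run this over every row of the output tensor and then apply non-max suppression to drop overlapping boxes.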

Accomplishments that we're proud of

We are proud of getting the detection system working, because it was acting up and took a lot of trial and error. We are also proud that the backend can interpret voice commands given in natural language.

What we learned

We learned how to work with various AI integrations, like our ElevenLabs and Featherless.ai integrations.

What's next for No Way Home

Detecting people, estimating how far obstacles are from the camera, and adding more complex navigation functionality. We also want to detect more kinds of obstacles than just trees.

Built With
