Visually impaired people often depend on others for their basic day-to-day activities. This dependence places a constant demand on caregivers, and the availability and compatibility of a helper is never guaranteed. So we set out to build a stand-alone assistant for visually impaired people.

What it does

Felix is an assistant device specially tailored for visually impaired people. We have tried to bridge the gap between sighted and visually impaired people by providing the necessary tools and services: Object Detection, OCR, Navigation, Lane Guidance, an SOS system, and a dedicated two-factor authentication system for easy and secure transactions.
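The two-factor step for transactions is not detailed above; as an illustration only, a minimal one-time-passcode flow (hypothetical function names, standard-library only) could look like this:

```python
import hmac
import secrets

def generate_otp(n_digits: int = 6) -> str:
    """Generate a random numeric one-time passcode,
    which the assistant could read out to the user."""
    return f"{secrets.randbelow(10 ** n_digits):0{n_digits}d}"

def verify_otp(expected: str, supplied: str) -> bool:
    """Constant-time comparison so the check does not leak timing information."""
    return hmac.compare_digest(expected, supplied)
```

In the actual device the passcode would be delivered over a second channel (for example, an SMS) before the transaction is confirmed.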

How we built it

We used Azure Object Detection and the Azure OCR API to extract information from the surroundings, the Bing Maps API to provide navigation, and OpenCV for lane detection. Alongside that, we built an SOS system backed by the Twilio Messaging API. The entire codebase is triggered and run through an Alexa Skill.
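As a sketch of the SOS path, the alert text can be built as a pure function and handed to a Twilio client. The function names, phone numbers, and coordinates below are illustrative placeholders, not our production values:

```python
def build_sos_message(user_name: str, lat: float, lon: float) -> str:
    """Compose the SOS text, embedding a Bing Maps link to the user's location."""
    return (f"SOS from {user_name}: assistance needed at "
            f"https://www.bing.com/maps?cp={lat}~{lon}")

def send_sos(client, to_number: str, from_number: str,
             user_name: str, lat: float, lon: float):
    """Send the SOS over SMS. `client` would be a
    twilio.rest.Client(account_sid, auth_token) instance."""
    return client.messages.create(
        to=to_number,        # caregiver's phone number
        from_=from_number,   # the Twilio-provisioned number
        body=build_sos_message(user_name, lat, lon),
    )
```

Keeping the message-building step separate from the Twilio call makes it easy to test without network access or credentials.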

Challenges we ran into

Uncertainty in the internet connections.
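One common mitigation for flaky connectivity is retrying cloud calls with exponential backoff; a minimal, library-agnostic sketch (the function name is illustrative):

```python
import time

def call_with_retry(fn, attempts: int = 3, base_delay: float = 0.1):
    """Call fn(), retrying on network-style errors with exponential backoff."""
    for attempt in range(attempts):
        try:
            return fn()
        except OSError:  # covers ConnectionError, timeouts surfaced as OSError
            if attempt == attempts - 1:
                raise  # out of attempts: surface the error to the caller
            time.sleep(base_delay * (2 ** attempt))  # 0.1s, 0.2s, 0.4s, ...
```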

Accomplishments that we're proud of

A working prototype of the entire ecosystem.

What we learned

Azure APIs, ngrok, VPN tunnelling, IoT, Object Detection, OCR, OpenCV, Lane Detection

What's next for Kamikaze

More personalised devices for differently abled people.

Built With

  • azure-vision-api
  • azure-ocr-api
  • bing-map-api
  • flask
  • alexa
  • ngrok