Our Pitch

Throughout this hackathon, we have been working hard on LightVision, an app that tells blind and visually impaired people what's in front of them.

An estimated 1.3 billion people worldwide live with some form of visual impairment. That is a huge number of people, and they deserve to be catered for.

Using TensorFlow, Google Cloud and machine learning, our app detects what is in front of the user's camera, essentially acting as the user's eyes. When the user taps once on the screen, the app reads the result aloud, so they know what is in front of them.
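
For anyone curious how that pipeline fits together, here is a minimal desktop sketch in Python. It is an illustration only: it assumes a pretrained MobileNetV2 classifier from Keras as a stand-in for our model, OpenCV for the camera, and pyttsx3 for speech, with the Enter key standing in for the on-screen tap; none of these names come from our actual codebase.

```python
import cv2                      # camera capture
import numpy as np
import pyttsx3                  # offline text-to-speech
from tensorflow.keras.applications.mobilenet_v2 import (
    MobileNetV2, preprocess_input, decode_predictions)

model = MobileNetV2(weights="imagenet")  # stand-in for the LightVision model
engine = pyttsx3.init()
camera = cv2.VideoCapture(0)

def describe_frame():
    """Grab one camera frame, classify it, and read the top label aloud."""
    ok, frame = camera.read()
    if not ok:
        return
    # MobileNetV2 expects a 224x224 RGB image; OpenCV delivers BGR frames.
    image = cv2.cvtColor(cv2.resize(frame, (224, 224)), cv2.COLOR_BGR2RGB)
    batch = preprocess_input(np.expand_dims(image.astype(np.float32), axis=0))
    predictions = model.predict(batch)
    label = decode_predictions(predictions, top=1)[0][0][1]  # readable class name
    engine.say(f"This looks like a {label.replace('_', ' ')}")
    engine.runAndWait()

while True:
    input("Press Enter (the 'tap') to hear what the camera sees: ")
    describe_frame()
```

On a phone, the same flow would swap in the platform's camera feed, touch events and accessibility text-to-speech, but the detect-then-speak loop stays the same.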

Our mission is to help those with disabilities achieve the same quality of life as everyone else. We may not have a fully-fledged product yet, but it's one step towards a better world.

FAQ:

Q. How much will this app cost? A. The app will be completely open-source and free to use; we believe everyone deserves the same quality of life, and that shouldn't be something you have to buy.

Q. What systems do I need to use it? A. All you need is a device with a camera and a speaker, which almost every modern phone has.

Q. How can I learn how to use it? A. No special skills are needed: simply point the camera at an object and tap the screen to hear what the object is.

Q. Will this work for those with hearing impairments? A. LightVision is designed specifically for those with vision impairments. Those with hearing impairments can still use it if they can see, since the output is also displayed on the screen. Unfortunately, we do not yet support output for people who are both deaf and blind; however, we plan to change this in the future using haptic devices.

Q. How accurate is it? A. Accuracy is currently low, because the model has not had enough time to train properly within the hackathon. Going forward, we will continue the project with a deep-learning model that gradually becomes smarter over time; this is already in progress, and the code has been uploaded by Kristoph. A rough sketch of the approach follows below.
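
As a hedged, hypothetical sketch of that approach (the folder layout, class count and file names below are illustrative, not taken from Kristoph's code), continued training could mean freezing a pretrained backbone and fine-tuning a small classification head on each new batch of labelled photos:

```python
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import MobileNetV2

NUM_ITEMS = 10  # hypothetical size of our everyday-item label set

# Freeze a pretrained backbone and train only a small head on top, so the
# model gradually gets smarter as more labelled photos are collected.
backbone = MobileNetV2(weights="imagenet", include_top=False,
                       pooling="avg", input_shape=(224, 224, 3))
backbone.trainable = False

model = models.Sequential([
    layers.Rescaling(1.0 / 127.5, offset=-1),  # scale pixels to the [-1, 1] range MobileNetV2 expects
    backbone,
    layers.Dense(NUM_ITEMS, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# "training_photos" is a hypothetical directory with one sub-folder per item;
# each new batch of photos nudges the classification head a little further.
dataset = tf.keras.utils.image_dataset_from_directory(
    "training_photos", image_size=(224, 224), batch_size=32)
model.fit(dataset, epochs=3)
model.save("lightvision_model.keras")
```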

Q. What can it describe? A. LightVision currently describes a pre-selected range of everyday items, including water bottles, phones and t-shirts. As the project grows, we will expand this set so it can recognise more and more items.

Q. Do you see the app complementing existing aids, e.g. walking sticks? A. Definitely. A walking stick can tell you how close something is, but not what that object is; the two aids go hand in hand and work together to provide a better life experience for the blind.

Q. How is this different to other apps on the market? A. This isn't just a fun app you'd use on your daily commute; it is essentially a life aid that could help people in their everyday lives.

View GitHub: https://github.com/stephyx/HackTheSouth
