The need of visually impaired people to live as normal a life as possible.
What it does
It helps people, primarily the visually impaired, navigate through a building, a set of rooms, or a set of buildings.
How we built it
We based our application on Bluetooth beacons, combining the Android SDK, Java, and the estimote-sdk. For communication with the user we used Google Voice Recognition and TextToSpeech, so the app works without relying on any visual elements.
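The core idea can be sketched in plain Java: map each beacon to a spoken zone description, pick the beacon with the strongest signal, and build the sentence that would be handed to Android's TextToSpeech. This is a minimal illustration, not the actual EyeC code; the beacon ids, zone names, and the VoiceGuide class are hypothetical.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: choose the nearest beacon by RSSI and produce the
// utterance that would be passed to TextToSpeech.speak() on Android.
public class VoiceGuide {

    private final Map<String, String> zoneByBeacon = new HashMap<>();

    public VoiceGuide() {
        // Illustrative beacon-to-zone mapping, configured per building.
        zoneByBeacon.put("beacon-01", "the main entrance");
        zoneByBeacon.put("beacon-02", "the checkout area");
    }

    /** Returns the spoken message for the beacon with the strongest RSSI. */
    public String utteranceFor(Map<String, Integer> rssiByBeacon) {
        String nearest = null;
        int best = Integer.MIN_VALUE; // RSSI is negative; closer to 0 = stronger
        for (Map.Entry<String, Integer> e : rssiByBeacon.entrySet()) {
            if (e.getValue() > best) {
                best = e.getValue();
                nearest = e.getKey();
            }
        }
        String zone = zoneByBeacon.getOrDefault(nearest, "an unknown area");
        return "You are near " + zone + ".";
    }
}
```

On Android the returned string would go to a `TextToSpeech` instance; keeping the zone logic in plain Java like this makes it easy to unit-test off-device.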
Challenges we ran into
We tried to build a high-accuracy location model, but we ran into the limitations of beacon technology: beacons do not report an absolute distance to the device, only an approximation of it, and the Bluetooth signal strength fluctuates even when the distance stays the same.
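The usual way to cope with this noise is to average the last few RSSI samples and then convert the smoothed value to a distance with the standard log-distance path-loss model, d = 10^((txPower − rssi) / (10 · n)). The sketch below uses illustrative constants, not values from our app:

```java
// Smoothing plus the log-distance path-loss model for RSSI-based ranging.
public class BeaconDistance {

    /** Simple moving average over the most recent RSSI samples (in dBm). */
    public static double smoothedRssi(int[] samples) {
        double sum = 0;
        for (int s : samples) sum += s;
        return sum / samples.length;
    }

    /**
     * Log-distance path-loss model: d = 10 ^ ((txPower - rssi) / (10 * n)).
     * @param txPower RSSI measured at 1 m (calibrated per beacon), in dBm
     * @param rssi    smoothed RSSI reading, in dBm
     * @param n       environment factor: ~2 in free space, higher indoors
     * @return estimated distance in metres
     */
    public static double estimateDistance(double txPower, double rssi, double n) {
        return Math.pow(10.0, (txPower - rssi) / (10.0 * n));
    }
}
```

Even with smoothing, the result is only an estimate: indoor reflections and the environment factor n make the model coarse, which is why the app reports zones rather than exact positions.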
Accomplishments that we're proud of
We managed to build an interface that is easy for the visually impaired to use and provides an approximation of the user's location (e.g. they might be at the checkout of a supermarket). We came up with an idea that can readily address real-life problems faced by the visually impaired and others.
What we learned
We improved our teamwork, learned how to adapt to a new API (estimote-sdk), and became more experienced in writing Android applications in Java.
What's next for EyeC
We hope that many other people will appreciate this idea and embrace it. This is only a demo and represents a basis for the idea. We expect that in the near future the technology will evolve and our application will no longer be limited by hardware problems. We believe the application can be expanded to cover a wider range of real-life problems (e.g. fully guided, precise assistance for the visually impaired).