Inspiration
Globally, out of an estimated 7.753 billion people, around 49.1 million are blind and 33.6 million have severe visual impairment. Visual impairment disrupts their daily lives and routines, and we believe we can develop a technological assistive application to improve their quality of life.
What it does
Our app has three main functions: notice reading, obstacle detection and notification, and community chat.
- The notice reading function lets users listen to notices in front of them simply by scanning them with our app: once a picture is taken, the app detects the text and reads it aloud, keeping users informed of important announcements about shops or public transport.
- The obstacle detection function helps users navigate pathways to their destinations safely, and also notifies them about interesting objects around them. Using the video streaming function, we provide live object detection and notification.
- The community chat function brings volunteers and the visually impaired community together on a virtual platform where the visually impaired can ask nearby volunteers to identify important items they encounter, or request physical assistance, promoting a kinder society. This is done through a video call system. Each volunteer has a profile showing how much they have interacted with the visually impaired community.
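Live detection on a video stream raises a practical question the write-up glosses over: without some throttling, the same obstacle would be announced on every frame. The write-up does not show the app's actual logic, so the sketch below is a hypothetical helper (`AnnouncementThrottler` is our name, not the app's) showing one simple way to suppress repeat announcements within a cooldown window:

```python
import time


class AnnouncementThrottler:
    """Decide whether to voice a detected object, suppressing repeats.

    Hypothetical sketch: live per-frame detection would otherwise
    announce the same obstacle dozens of times per second, so each
    label is only announced again after `cooldown_s` seconds.
    """

    def __init__(self, cooldown_s=5.0, clock=time.monotonic):
        self.cooldown_s = cooldown_s
        self.clock = clock  # injectable for testing
        self._last_announced = {}  # label -> time of last announcement

    def should_announce(self, label):
        now = self.clock()
        last = self._last_announced.get(label)
        if last is not None and now - last < self.cooldown_s:
            return False  # announced this object recently; stay quiet
        self._last_announced[label] = now
        return True
```

In the app, `should_announce` would gate the call to the platform's text-to-speech engine; passing a fake clock makes the cooldown behaviour easy to unit-test.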
How we built it
We built the app with Android Studio, using pretrained TensorFlow Lite models as the core of our object detection, Optical Character Recognition (OCR) for notice reading, and Agora for the video call feature.
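Pretrained detection models of the kind shipped for TensorFlow Lite typically return parallel arrays of bounding boxes, class indices, and confidence scores, which the app then has to filter and map to human-readable labels before anything can be voiced. As a minimal sketch (not the app's actual code; the function name, dict layout, and threshold are our assumptions), post-processing that output might look like:

```python
def filter_detections(boxes, class_ids, scores, labels, min_score=0.5):
    """Keep detections above a confidence threshold and map class
    indices to label names.

    Hypothetical sketch of post-processing the parallel output arrays
    (boxes, class indices, scores) a pretrained detection model
    typically returns. `labels` maps class index -> name, as loaded
    from the model's label file.
    """
    results = []
    for box, cid, score in zip(boxes, class_ids, scores):
        if score < min_score:
            continue  # too uncertain to announce
        results.append({
            "label": labels[int(cid)],
            "score": score,
            "box": box,  # (ymin, xmin, ymax, xmax), normalized
        })
    return results
```

Only the surviving labels would then be handed to the announcement layer, so low-confidence noise never reaches the user.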
Challenges we ran into
Coding an app is difficult when you are unfamiliar with the software itself. As such, we had to do a lot of research into the various functions of the tools we used.
Accomplishments that we're proud of
Integrating everything within half a day was quite a feat for us, and we are very glad that it is functional.
What we learned
We learned many tools for app development, deep learning, and cloud-based video call systems, as well as user interface design.
What's next for Eh Eye
We could bring the app to a wearable smaller than a phone, such as smart glasses, making it even more lightweight for our users. We could also implement more features, such as GPS tracking that automatically informs users of bus timings or traffic conditions as they navigate their routes.
Built With
- agora
- android-studio
- ocr
- tensorflow-lite
