Globally, at least 2.2 billion people have a vision impairment or blindness, of whom at least 1 billion have an impairment that could have been prevented or has yet to be addressed (https://www.who.int/news-room/fact-sheets/detail/blindness-and-visual-impairment). This is a huge section of the population deprived of the benefits of advanced technology that could improve their lives and experiences. A simple task like shopping and reading the labels and warnings on food becomes a burden, one for which visually impaired people must depend on others. To address this and similar issues, we created MobilEyes.
What it does
At a basic level, this is a mobile app that uses object recognition, OCR, and accessibility services to identify objects in front of a visually impaired person and to read labels aloud using a text-to-speech engine. Modern smartphones all ship with accessibility features, one of the most common being a text-to-speech engine. We use this in conjunction with our app to make grocery shopping and buying food, medicine, and other items an easier task for the visually impaired. We also have a subsystem that can identify the generic medication name given a brand name (for example, Tylenol -> Acetaminophen -> Paracetamol); this is not completely implemented yet.
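The core loop described above can be sketched as follows. This is a minimal, hypothetical illustration, not the actual app code: the function names and the "try OCR first, fall back to object recognition" ordering are assumptions, and the OCR and classifier services are passed in as plain callables.

```python
# Hypothetical sketch of the app's core flow (names are illustrative):
# given a captured image, try to read any text on it; if no text is
# found, fall back to object/food recognition. The returned sentence
# is what the text-to-speech engine would speak.

def describe_image(image_bytes, ocr, classify):
    """Return the sentence to be spoken for a captured image.

    `ocr` and `classify` are injected callables standing in for the
    OCR service and the food-recognition model, respectively.
    """
    text = ocr(image_bytes)
    if text and text.strip():
        return f"The label reads: {text.strip()}"
    label = classify(image_bytes)
    return f"This looks like {label}"
```

Injecting the two services as callables keeps the sketch self-contained; in the real app those calls would go over the network to the hosted models.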
How I built it
We used Flutter for the frontend, with the image_picker and flutter_tts plugins from pub.dev implementing some of the app's features. The food detection algorithm uses a pre-trained convolutional neural network (CNN) deployed via Flask and hosted on Clarifai; each food class was trained on 1,000 images, with a minimal error rate. The label/text detection uses optical character recognition (OCR) based on Tesseract, hosted on Google Cloud Platform.
Challenges I ran into
We first had trouble getting the client side and the backend to communicate through POST requests; we solved this by reformatting the data sent in the request body. We also had some internet issues, but thankfully we were able to pull through and integrate the app completely and effectively.
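One common way to reformat image data for a POST body is to base64-encode the bytes so they survive JSON serialization. The exact fix we used is not documented here; the snippet below (standard library only) just illustrates the kind of reformatting involved.

```python
# Illustrative only: base64-encode image bytes so they can travel
# inside a JSON POST body, and decode them on the server side.
import base64
import json


def make_payload(image_bytes: bytes) -> str:
    """Wrap raw image bytes in a JSON string, base64-encoded."""
    return json.dumps({"image_b64": base64.b64encode(image_bytes).decode("ascii")})


def decode_payload(payload: str) -> bytes:
    """Recover the original image bytes from the JSON payload."""
    return base64.b64decode(json.loads(payload)["image_b64"])
```

Base64 avoids the problem of raw binary bytes being mangled by JSON or text-based transports, at the cost of roughly a third more payload size.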
Accomplishments that I'm proud of
We are proud that we completed what we set out to do in the first place. Our app is fully functional: it can read labels aloud and identify the foods our model was trained on.
What I learned
Making accessibility-focused apps and services requires good design and decision-making skills.
What's next for MobileEyez
Medication identification system (partially implemented): given a brand name, return the generic medication name (for example, Tylenol -> Acetaminophen -> Paracetamol). We hope to implement more machine learning models so that the app can identify more and more things.
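The brand-to-generic lookup could start as simply as the sketch below. The lookup table is a tiny illustrative sample (not the project's actual data source), and the function name is hypothetical.

```python
# Minimal sketch of the brand-name -> generic-name lookup.
# The table is a small illustrative sample, not real project data.
BRAND_TO_GENERIC = {
    "tylenol": ["acetaminophen", "paracetamol"],
    "advil": ["ibuprofen"],
    "benadryl": ["diphenhydramine"],
}


def generic_names(brand: str) -> list:
    """Return known generic names for a brand, case-insensitively.

    Returns an empty list for unknown brands rather than raising,
    so the caller can fall back to speaking the brand name as-is.
    """
    return BRAND_TO_GENERIC.get(brand.strip().lower(), [])
```

A production version would query a proper drug database instead of a hard-coded dictionary, but the interface (brand in, list of generics out) would stay the same.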