💡 Inspiration

Blindness and vision impairment affect at least 2.2 billion people around the world. Of those, 1 billion have a vision impairment that could have been prevented or has yet to be addressed. Reduced or absent eyesight can have major and long-lasting effects on all aspects of life, including daily personal activities, interacting with the community, school and work opportunities, and the ability to access public services. Visual impairment is also a common complication of diseases like diabetes. Imagine someone with allergies or other dietary restrictions: choosing food products and knowing whether they are suitable (for example, whether they contain an allergen) becomes very difficult, especially when they are alone.

🤳 What it does

Eye.ai is a simple mobile app that helps visually impaired users build a profile about themselves by answering questions such as whether they have diabetes or any allergies, and then helps them identify packaged food products and determine whether those products are suitable for them. A rough sketch of the suitability check is shown below.
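As a rough illustration of that check, here is a minimal Dart sketch; the class, field, and function names are hypothetical placeholders, not the app's actual code. It simply compares a product's detected ingredient list against the restrictions stored in the user's profile.

```dart
// Hypothetical sketch: check a detected product against the user's profile.
class UserProfile {
  final bool hasDiabetes;
  final Set<String> allergens; // e.g. {'peanut', 'milk'}
  UserProfile({required this.hasDiabetes, required this.allergens});
}

/// Returns a spoken-style verdict for a product given its ingredient list.
String checkProduct(UserProfile profile, List<String> ingredients) {
  final lower = ingredients.map((i) => i.toLowerCase()).toList();
  final hits = profile.allergens
      .where((a) => lower.any((i) => i.contains(a.toLowerCase())))
      .toList();
  if (hits.isNotEmpty) {
    return 'Not suitable: contains ${hits.join(', ')}.';
  }
  if (profile.hasDiabetes && lower.any((i) => i.contains('sugar'))) {
    return 'Caution: this product contains sugar.';
  }
  return 'This product looks suitable for you.';
}
```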

⚙️ How I built it

I built this app using Flutter, a cross-platform development framework that allows the same code base to target multiple platforms such as iOS, Android, and the web. I used TensorFlow Lite to build the object detection model and deployed it on Google Cloud, and I used Flutter text-to-speech and speech-to-text plugins for the voice interface.
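As a minimal sketch of how such a voice interface can be wired up with the flutter_tts and speech_to_text packages (assumed here; the app's exact setup and timing logic may differ), a question can be spoken aloud and the user's spoken answer captured like this:

```dart
import 'package:flutter_tts/flutter_tts.dart';
import 'package:speech_to_text/speech_to_text.dart' as stt;

final FlutterTts tts = FlutterTts();
final stt.SpeechToText speech = stt.SpeechToText();

/// Speak a prompt, then listen for the user's spoken answer.
Future<String> askQuestion(String prompt) async {
  await tts.speak(prompt);

  final available = await speech.initialize();
  if (!available) return '';

  String answer = '';
  await speech.listen(
    onResult: (result) => answer = result.recognizedWords,
  );
  // Simplified: a real app would wait for the final result or a timeout.
  await Future.delayed(const Duration(seconds: 5));
  await speech.stop();
  return answer;
}
```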

🤔 Challenges I ran into

I had issues getting the text-to-speech plugins to work and spent a lot of time watching tutorials on using ML with Flutter.

🏅 Accomplishments

I am proud of what I built and the new things that I learned along the way.

💭 What's next for Eye.ai

Improving the UI is the top priority, along with adding more profile questions to provide a better experience and training a better model that can recognize more packaged items.

Built With

Flutter, TensorFlow Lite, Google Cloud
