Inspiration

A visually challenged friend, frustrated one day at his own helplessness, told us how much of a struggle it was for him to do basic tasks: knowing what is in his food, whether his laptop is switched on, and so on.

We decided that we would do our bit to help empower him, and many others, for whom independence is priceless.

What it does

The app opens to a camera that scans the barcode of any food item. It then navigates to the next page, where it displays and reads aloud the item's nutritional values, ingredients, and allergens.

How we built it

We built the app with React Native. Barcodes are scanned with the Expo BarCodeScanner, and the scanned code is sent to the Chomp food and nutrition API to retrieve the item's information.
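The lookup-and-read-aloud flow can be sketched with two small helpers: one that builds the Chomp barcode-lookup URL, and one that turns a (simplified) response into the text the app speaks. The endpoint path, field names (`name`, `ingredients`, `allergens`), and function names here are our own assumptions for illustration, not the exact shapes in our code or in Chomp's API.

```javascript
// Assumed Chomp barcode-lookup endpoint; the real path and parameters may differ.
const CHOMP_BASE = "https://chompthis.com/api/v2/food/branded/barcode.php";

// Build the lookup URL for a scanned barcode.
function buildLookupUrl(apiKey, barcode) {
  const params = new URLSearchParams({ api_key: apiKey, code: barcode });
  return `${CHOMP_BASE}?${params.toString()}`;
}

// Turn a simplified item record into the sentence the app reads aloud.
function summarizeItem(item) {
  const parts = [`This is ${item.name}.`];
  if (item.ingredients) {
    parts.push(`Ingredients: ${item.ingredients}.`);
  }
  if (item.allergens && item.allergens.length > 0) {
    parts.push(`Allergens: ${item.allergens.join(", ")}.`);
  }
  return parts.join(" ");
}
```

In the app, the string returned by `summarizeItem` would be handed to a text-to-speech module (e.g. Expo's speech support) after the barcode scan resolves.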

Challenges we ran into

We wanted to make our app more interactive by adding the Google Cloud Speech-to-Text API, but we were unable to fully implement it due to lack of time.

Accomplishments that we're proud of

We are proud that we could help empower fellow human beings by allowing them to be more independent and self-reliant.

What we learned

We learned that it does not take much to step up and help others. Most of us take for granted things that someone else would be thankful to have; it is important to appreciate what you have and lend a helping hand when you have the resources to do so.

What's next for Eyegredients

A more interactive app that lets users say aloud which food information they want, rather than having to hear the entire list. For example, if you only want to know about allergens, that is all you will hear.
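One way to route a spoken request, sketched under our own assumptions: match keywords in the speech-to-text transcript to a category of information to read out. The keyword table and function name below are hypothetical; a real implementation would work from the transcript produced by the speech-to-text API.

```javascript
// Hypothetical keyword-to-category table for routing spoken queries.
const CATEGORIES = {
  allergen: "allergens",
  ingredient: "ingredients",
  nutrition: "nutrition",
  calorie: "nutrition",
};

// Pick which information to read aloud based on the transcript.
function categoryForQuery(transcript) {
  const words = transcript.toLowerCase();
  for (const [keyword, category] of Object.entries(CATEGORIES)) {
    if (words.includes(keyword)) return category;
  }
  return "all"; // no keyword matched: fall back to reading everything
}
```

A simple keyword match like this is enough for a small, fixed set of commands; fuzzier queries would need proper intent recognition.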

We also aim to extend the app's functionality with features that let users learn whether their food has expired, the item's shelf life, and more.

Built With

React Native, Expo BarCodeScanner, Chomp food and nutrition API