To the able-bodied individual, eating can be one of humanity's favorite pastimes. But what happens when you can't take posting that delicious fruit bowl to Instagram for granted? I believe everyone should be able to know what they're eating, and that's where the concept for this machine came from. I wanted to build something that lets even a visually impaired person know what they're eating.

What it does

Just point the camera at any food item (or several) and tap the screen; with a Bluetooth speaker or headphones connected, you'll hear the name of the food that's right in front of you! It can even recognize your favorite brands, so you know that what you're eating is what you want to eat! Just point, tap, and listen to EyeToldYou!

How I built it

I used a Raspberry Pi hooked up to a webcam to act as the "eyes," with a touchscreen serving as the program launcher. Point the camera at your pantry, fridge, or fruit basket, and the captured picture is sent to the Google Cloud Vision API, which analyzes exactly what's in front of you and sends the result back to the Raspberry Pi. With headphones or a Bluetooth-enabled speaker connected, the Pi then speaks aloud the name of the food item the camera is pointed at.
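The original code isn't included here, but below is a minimal MATLAB sketch of the recognition-and-speech step, under a few assumptions of mine: the request goes through Cloud Vision's REST endpoint, the API key lives in a GCV_API_KEY environment variable, and espeak is installed for text-to-speech. The function name speakLabels is just for illustration, not from the actual project.

```matlab
% Minimal sketch: label a captured JPEG with the Cloud Vision REST API and
% speak the top result. GCV_API_KEY and espeak are assumptions, not details
% taken from the actual project.
function speakLabels(jpegBytes)
    % Cloud Vision wants base64 image bytes plus a list of requested features.
    req.image.content = matlab.net.base64encode(jpegBytes);
    req.features = {struct('type', 'LABEL_DETECTION', 'maxResults', 3)};
    body.requests = {req};               % cell arrays encode as JSON arrays

    url = ['https://vision.googleapis.com/v1/images:annotate?key=' ...
           getenv('GCV_API_KEY')];
    opts = weboptions('MediaType', 'application/json', 'Timeout', 30);
    response = webwrite(url, body, opts);   % webwrite JSON-encodes the struct

    % Read back the best guess and say it through whatever audio device is
    % active, which is where the Bluetooth speaker or headphones come in.
    label = response.responses(1).labelAnnotations(1).description;
    system(['espeak "' label '"']);
end
```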

Challenges I ran into

Interestingly, I ran into issues because I didn't have a camera module for the Raspberry Pi. I found an old webcam instead, and the Google Cloud Vision API does a splendid job of handling even low-megapixel pictures. I also found it challenging to take pictures live and send them to the API without automatically saving them, all while keeping the experience seamless; one possible workaround is sketched below.
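Here's one hedged take on that live-capture problem. The webcam and snapshot calls come from MATLAB's USB webcam support package, and the fullscreen-figure tap handler is my guess at how a touchscreen launcher might be wired up. Since imwrite only writes to files, the frame takes a quick detour through a temp file that's deleted immediately, so no pictures pile up on the device.

```matlab
% Hypothetical tap-to-capture flow: a fullscreen figure is the touchscreen
% tap target, and each tap grabs one frame and hands it to speakLabels
% (sketched above) without leaving a saved picture behind.
function tapToListen()
    cam = webcam();   % first attached USB webcam (webcam support package)
    figure('WindowState', 'fullscreen', 'Color', 'black', ...
           'ButtonDownFcn', @(~, ~) onTap(cam));
end

function onTap(cam)
    frame = snapshot(cam);      % one frame, straight from the camera
    tmp = [tempname '.jpg'];    % imwrite needs a file, so use a throwaway one
    imwrite(frame, tmp);
    fid = fopen(tmp, 'r');
    bytes = fread(fid, Inf, '*uint8');
    fclose(fid);
    delete(tmp);                % gone right away; nothing accumulates
    speakLabels(bytes);         % name what the camera sees, out loud
end
```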

Accomplishments that I'm proud of

This is actually my first time using any sort of hardware technology. I am incredibly proud that I even got Google Cloud Vision working! It's so interesting to me what a small device with an internet connection and an output device can handle. I feel as if I gained a new, refreshing perspective on hardware and how it intertwines with code. Really cool stuff.

What I learned

Hardware is easy! I did not realize that such a small machine could do so much! I can very much see myself using the Google Cloud Vision API in the future in any apps I develop. I also learned how to code with MATLAB, use a Raspberry Pi, and, importantly, use APIs to tap into powerful algorithms that one normally wouldn't be able to build alone.

What's next for EyeToldYou

The next EyeToldYou machine should obviously have better hardware: a better camera, a sleeker design, and a way to start with just your voice (since it's aimed at visually impaired people). I hope this sort of tech makes its way into a truly portable form, so you could use these eyes for more than just pantry ingredients; I have street walk signs, menus, and traffic conditions in mind. If I could implement cellular internet and a smaller design, I could see people wearing this as a sort of pin on a shirt that sees everything you see and notifies you of whatever you choose to be notified of!
