Inspiration
This project was inspired by both Children's Health and the PepsiCo challenges.
What it does
The goal is to make it easy for patients in a hospital setting to log the foods they eat. Clinicians can then see a record of what was consumed and when. The patient simply places the food package in front of a small camera to log the food; the camera sensor could be mounted on a bedside table.
How we built it
The camera is connected to a Raspberry Pi board, which also has a Wi-Fi chip for network connectivity. When the camera detects motion, it snaps a picture and calls a backend API to detect the item in the camera frame, convert the resulting label into spoken text (an MP3 file), and play the audio back for user feedback. The image is saved to an S3 bucket and a record is written to an InfluxDB database for further analysis.
Challenges we ran into
We had a hardware issue with a malfunctioning camera that required troubleshooting, and we had to limit the scope of detection to just two products (Doritos and Pepsi) to finish the project on time.
Accomplishments that we're proud of
Although the project ended up more limited in scale than we hoped, we were able to ship something by the end of the weekend.
What we learned
We learned how to train a custom computer vision model using the AWS Rekognition brand recognition service.
What's next for BedsIOTde Manner
We want to continue working on this project even after the hackathon ends.