- Camera embedded into the view with a Snapchat-like homepage
- Croppable images using a draw tool
- Our neural network quickly returns nutritional information about the image
- The user can log all of their food and its corresponding info
- Hot streak that tracks how many consecutive days the user has logged information into the system
- Goal-setting features and a pie chart showing the distribution of macros in one's diet
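The hot-streak feature above boils down to a backward scan over the dates on which the user logged food. A minimal sketch of that calculation (the function name `hot_streak` and the date-set representation are illustrative assumptions, not NutriSight's actual code):

```python
from datetime import date, timedelta

def hot_streak(log_dates, today=None):
    """Count consecutive days, ending today or yesterday, with logged meals.

    `log_dates` is any iterable of datetime.date objects for days on which
    the user logged at least one food item.
    """
    days = set(log_dates)
    today = today or date.today()
    # A streak is still "alive" if the user logged today or yesterday.
    current = today if today in days else today - timedelta(days=1)
    streak = 0
    while current in days:
        streak += 1
        current -= timedelta(days=1)
    return streak
```

For example, logs on three consecutive days ending today yield a streak of 3, while any missed day in between resets the count.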
Inspiration
“We all eat, and it would be a sad waste of opportunity to eat badly.” - Anna Thomas
We were inspired by our personal struggles to maintain healthy diets while in college away from home and surrounded by unhealthy meal choices. Our busy college lifestyles led us to prioritize a highly efficient logging system that does not require a significant daily time commitment.
What it does
"NutriSight is like X-ray vision for your tastebuds" - Justin Kula
NutriSight provides a streamlined mechanism for mealtime nutritional evaluation and persistent long-term dietary logging. The app also incentivizes healthy choices by encouraging continuous diet tracking in an enjoyable, compelling "gamified" social experience. Finally, it eliminates the need for time-consuming nutritional searches by encapsulating multi-component meal evaluation in a unified and streamlined process.
How we built it
We split our development team into separate client-side and server-side subgroups to facilitate rapid development and a cooperative workflow. We then tailored our choice of technologies to the individual strengths of our team members to minimize research overhead and maximize development speed. For example, Justin developed and trained the neural network from scratch, Noah developed the front-end React app, and Sanil designed the UI. We all contributed to our internal API, which we built from scratch in Python, and we graphically illustrated the architecture of the overall application prior to development to ensure the feasibility of our desired functionality.
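Our internal Python API is what connects the client to the model. As a rough illustration of the shape of such an endpoint (Flask is used here for brevity, and `classify_image` is a hypothetical stand-in for the real model call; none of these names come from our codebase):

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

def classify_image(image_bytes):
    # Placeholder for the neural-network call; the real model would
    # return nutritional information for the recognized food.
    return {"food": "apple", "calories": 95}

@app.route("/analyze", methods=["POST"])
def analyze():
    # The client uploads a photo; the server responds with nutrition data.
    upload = request.files.get("image")
    if upload is None:
        return jsonify({"error": "no image provided"}), 400
    return jsonify(classify_image(upload.read()))
```

Keeping the route thin and pushing all model logic behind one function makes it easy for the client-side and server-side subgroups to develop against an agreed contract.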
Challenges we ran into
We were tempted to use a service like the Google Cloud Vision API or Amazon Rekognition because efficiently scheduling intensive neural-network processing tasks on a resource-constrained real-time server was difficult. However, over the course of the hackathon, our homegrown neural network's runtime and memory efficiency improved drastically. Reducing the perceived latency of remote processing during user image analysis was also tough. Finally, providing a responsive and intuitive user experience across a variety of screen dimensions and processor performance levels was a challenge that we had to work through as a team.
Accomplishments that we're proud of
- Integrated high-performance native Swift modules to improve the responsiveness of essential UI components.
- Compiled Google’s TensorFlow framework with the Intel® Math Kernel Library to deliver an 8x speedup of our model’s neural-network analysis time.
- Balanced the response-time constraints of a real-time application against the high overhead of long-running machine learning computations.
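An MKL-enabled TensorFlow build like the one mentioned above is typically produced from source with Bazel's `mkl` configuration. Exact flags vary by TensorFlow version, so treat this as a sketch of the documented build path rather than our exact build line:

```shell
# From a TensorFlow source checkout: configure, then build a pip package
# with Intel MKL-DNN optimizations enabled (--config=mkl) and general
# compiler optimizations (--config=opt).
./configure
bazel build --config=mkl --config=opt //tensorflow/tools/pip_package:build_pip_package
```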
What we learned
We learned the importance of communication in a multi-member development team as we dealt with integration hurdles between client- and server-side modules. We also learned the importance of application responsiveness as we addressed our largest initial criticism: poor recognition performance.
What's next for NutriSight
- Expand our training data set to ensure the inclusion of cultural diets worldwide.
- Add additional achievements and competitions (such as a New Year's resolution competition among friends).
- Add a feature that auto-crops an image into disjoint partitions when multiple food items are on a plate.