Inspiration

It’s easy to understand how taking shorter showers saves water, or how biking instead of driving reduces greenhouse gas emissions. But most of us never see the massive amounts of water, greenhouse gas emissions, and land used to produce and transport our food.

It turns out that the way we eat is one of the biggest drivers of our most pressing environmental issues, from global warming to deforestation. Over a quarter of global greenhouse gas emissions come from food, and 33% of all ice-free land is used to grow it!

Making small, incremental changes to our diets can meaningfully reduce this impact. We aim to encourage people to make better choices by automatically analyzing the impact of what they eat.

What it does

Our app offers a suite of tools that empowers consumers and businesses alike to make sustainable choices when it comes to food consumption. We use computer vision and natural language processing to recognize food items from a photo input, either a real-life photo or a picture of a menu, and then visualize the environmental impact of each item.
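
As a rough illustration of the menu path, the sketch below assumes Cloud Vision’s OCR pulls the raw text off a menu photo before any further language analysis; the actual parsing of dish names is more involved than shown.

```python
# A sketch of the menu-reading step using Cloud Vision OCR
# (pip install google-cloud-vision). "menu.jpg" is a placeholder path.
from google.cloud import vision

def read_menu(image_path: str) -> list[str]:
    """Extract lines of text from a photographed menu."""
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.text_detection(image=image)
    if not response.text_annotations:
        return []
    # The first annotation contains the full detected text block.
    return response.text_annotations[0].description.splitlines()

print(read_menu("menu.jpg"))  # e.g. ['Beef Burger  $12', 'Tofu Bowl  $10', ...]
```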

From a picture of a food item, we compute the environmental footprint of its production along three dimensions: the amount of water used, the greenhouse gases emitted, and the area of land required.

The end result is a striking, informative graphic that helps the user understand the consequences of different food options by translating each metric into a more relatable quantity. For example, water usage is compared to the number of showers that use the same amount of water, while greenhouse gas emissions are compared to how many miles could be driven to produce the same emissions.
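
A minimal sketch of these conversions, assuming illustrative reference values (roughly 65 liters for an average shower and 0.4 kg CO2e per mile driven); the exact constants in the app may differ:

```python
# Assumed reference values for the everyday equivalents.
LITERS_PER_SHOWER = 65.0   # approx. average shower
KG_CO2E_PER_MILE = 0.4     # approx. average passenger car

def to_relatable_units(water_liters: float, ghg_kg_co2e: float) -> dict:
    """Convert raw footprint metrics into everyday equivalents."""
    return {
        "showers": water_liters / LITERS_PER_SHOWER,
        "miles_driven": ghg_kg_co2e / KG_CO2E_PER_MILE,
    }

# Illustrative inputs only, not our published figures:
print(to_relatable_units(1500, 60))  # ~23 showers, ~150 miles
```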

It also provides a comparison to other foods to encourage more sustainable choices. Furthermore, if the input photo is a menu, our algorithm makes a data-driven recommendation of the best items to order based on these metrics.
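
One way such a recommendation could work is sketched below; the equal weighting of the three normalized metrics is our assumption for illustration, not necessarily the exact scheme in the app, and the sample numbers are placeholders.

```python
def rank_menu_items(items: dict) -> list:
    """items: name -> {'water': liters, 'ghg': kg CO2e, 'land': m^2}.
    Returns item names ordered from most to least sustainable."""
    metrics = ("water", "ghg", "land")
    # Normalize each metric by the menu's maximum so no single unit dominates.
    maxima = {m: max(v[m] for v in items.values()) or 1.0 for m in metrics}

    def score(name):
        return sum(items[name][m] / maxima[m] for m in metrics) / len(metrics)

    return sorted(items, key=score)

menu = {
    "beef burger":  {"water": 1500, "ghg": 60.0, "land": 300},
    "chicken wrap": {"water": 600,  "ghg": 6.0,  "land": 12},
    "tofu bowl":    {"water": 300,  "ghg": 3.0,  "land": 4},
}
print(rank_menu_items(menu))  # ['tofu bowl', 'chicken wrap', 'beef burger']
```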

Consumers can use our app to easily take a picture of any product they purchase, or of a menu at a restaurant, to gain insight into their personal footprint. Meanwhile, businesses such as restaurants can easily train their own models for their menu items, so that when a customer purchases an item, its environmental impact is visualized. This empowers businesses interested in selling more environmentally friendly food.

Our app not only educates users about environmental impact but also offers concrete, actionable steps toward a more sustainable future.

How we built it

To identify food from an image, we use Google’s Cloud Vision API, which accurately identifies food across many different categories, and Microsoft Azure’s Custom Vision API, which lets us train powerful custom classification models.
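
A minimal sketch of the identification step with the google-cloud-vision client library; the 0.7 confidence cutoff and file path are assumptions for illustration:

```python
from google.cloud import vision

def identify_food(image_path: str) -> list[str]:
    """Return Vision API labels for a photo, most confident first."""
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    # Keep only reasonably confident labels; 0.7 is an assumed cutoff.
    return [label.description
            for label in response.label_annotations
            if label.score > 0.7]

print(identify_food("lunch.jpg"))  # e.g. ['Food', 'Hamburger', ...]
```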

The identified item is then processed by a novel rating algorithm that uses data from recent sustainability research (see Sources below) to calculate its relative environmental impact in three categories: freshwater use, greenhouse gas emissions, and land use during production.
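
The lookup at the heart of the rating algorithm might look like the sketch below; the per-kilogram values are rough placeholders standing in for the published data, not the figures we actually use.

```python
# Placeholder per-kg footprints: (freshwater L, GHG kg CO2e, land m^2).
FOOTPRINT_PER_KG = {
    "beef":    (1500.0, 60.0, 300.0),
    "chicken": (600.0,  6.0,  12.0),
    "tofu":    (300.0,  3.0,  3.5),
}

def food_footprint(food: str, serving_kg: float = 0.25) -> dict:
    """Scale the per-kg data to an (assumed) serving size."""
    water, ghg, land = FOOTPRINT_PER_KG[food]
    return {
        "water_liters": water * serving_kg,
        "ghg_kg_co2e": ghg * serving_kg,
        "land_m2": land * serving_kg,
    }

print(food_footprint("beef"))
```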

We use the Django framework to communicate these metrics in a streamlined manner, and store our data in a Firebase database.
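
A minimal sketch of that plumbing, assuming Cloud Firestore via the firebase_admin SDK; the “foods” collection and the view name are hypothetical:

```python
import firebase_admin
from firebase_admin import firestore
from django.http import JsonResponse

# Assumes GOOGLE_APPLICATION_CREDENTIALS points at a service account key.
firebase_admin.initialize_app()
db = firestore.client()

def footprint(request, food_name):
    """GET /footprint/<food_name>/ -> the stored metrics as JSON."""
    doc = db.collection("foods").document(food_name).get()
    if not doc.exists:
        return JsonResponse({"error": "unknown food"}, status=404)
    return JsonResponse(doc.to_dict())
```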

Challenges we SUSTained

It’s tough to measure exactly how much water, land, and other resources are used throughout the many stages of food production. We use the best available figures, but we recognize that each comes with a range of uncertainty and ours may not be perfectly accurate. Nevertheless, they paint a good picture of relative environmental impact.

We discovered that the Google Cloud Vision API and the Microsoft Azure Custom Vision API each have their strengths and weaknesses. The former is fantastic at categorizing food, while the latter let us build custom models that identify a specific set of items.
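
For reference, querying a published Custom Vision classifier looks roughly like this; the endpoint, key, project ID, iteration name, and file path below are all placeholders:

```python
# pip install azure-cognitiveservices-vision-customvision
from azure.cognitiveservices.vision.customvision.prediction import (
    CustomVisionPredictionClient,
)
from msrest.authentication import ApiKeyCredentials

# Placeholders: fill in your own resource's values.
ENDPOINT = "https://<region>.api.cognitive.microsoft.com"
PREDICTION_KEY = "<prediction-key>"
PROJECT_ID = "<project-id>"
PUBLISHED_NAME = "menuModel"  # the published iteration's name

credentials = ApiKeyCredentials(in_headers={"Prediction-key": PREDICTION_KEY})
predictor = CustomVisionPredictionClient(ENDPOINT, credentials)

with open("menu_item.jpg", "rb") as f:
    results = predictor.classify_image(PROJECT_ID, PUBLISHED_NAME, f.read())

for prediction in results.predictions:
    print(f"{prediction.tag_name}: {prediction.probability:.0%}")
```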

Accomplishments that we’re proud of

We effectively incorporated powerful APIs that enabled our app to go above and beyond, and meshed our Django frontend with our Python backend.

Perhaps most importantly, we are proud of designing a product that reflects our passions: to preserve the environment and educate people about an easily overlooked but critical issue (and incorporate some good wordplay!).

What we learned

Although raising awareness about sustainability is an issue we care deeply about, we were unsure how to build something that would actually influence people’s behavior. After all, information is useless if it does not lead to action. Throughout the process, we discussed and explored different ways of achieving this goal; our group discussions led to results and realizations that would not have been possible alone.

We dared to explore frameworks, APIs, and new technologies that we had only heard of but never actually used. We made the most of this hackathon as an opportunity to take risks and explore these innovations.

We’ve learned that such a daunting problem can be met head-on by taking small but effective steps in the right direction, a mindset that will carry us through the continuation of this project and other challenges yet to come.

How we plan to SUSTain SUST

  • We hope to acquire more accurate and reliable data for the environmental impact of different foods.
  • To enable individual users to use this application on the go, we will make this into a mobile application.
  • We will also initiate conversations with restaurants, universities, grocery stores, caterers, etc., to gather more data and feedback for the custom machine learning component of our app.

Built with

  • Python
  • Google Cloud API (Vision, Natural Language)
  • Microsoft Azure Custom Vision API
  • Firebase
  • JavaScript
  • Django

Sources for Environmental Statistics and Data

  • National Service Center for Environmental Publications
  • J. Poore and T. Nemecek, “Reducing food’s environmental impacts through producers and consumers,” Science 360, 987–992 (2018)
  • P. Behrens et al., “Evaluating the environmental impacts of dietary recommendations,” Proceedings of the National Academy of Sciences (2017)