Almost all responsible citizens and high schoolers who know about climate change see it as an issue and acknowledge the need to be more environmentally friendly. But when we learned that many manufactured goods have CO2 footprints of up to 200 kg, and that (according to a study by a German manufacturer) just one year of cell-phone use has a CO2 footprint of 1.45 tonnes(!!!), we realized how large a gap exists in how we deal with industrial emissions: the manufactured and agricultural goods that dominate our lives.

What if?

We plant trees and reduce driving, but what people don't realize is that a major contributor to carbon emissions is the objects we use every day! Industry is a huge contributor to climate change, and we were inspired to find a way for average people to do something about it.

What it does

Greenway provides a fast, easy, and interactive way to learn how the manufacturing of your belongings, like cellphones, headphones, and backpacks, contributes to carbon emissions, and, much more importantly, what you can do to reduce that footprint. The idea is simple: you scan any everyday object, and Greenway identifies it, instantly reports its CO2 footprint, and suggests green alternatives to the object you scanned.

You receive YouTube links, DIY tutorials, and online articles detailing ways to improve on the object you scanned. Say you scan a shoe: you get its footprint (in kg), videos about which shoes are carbon-friendly, shopping options that take you directly to sellers of eco-friendly shoes, and a list of eco-friendly alternatives, such as kelp shoes (along with links to articles where you can learn more about these alternatives). To keep the app simple, we condensed it to three easily navigable pages and built it with the intention of making the process as effortless and reliable as possible.

Aside from your own scans, the entry page also features other people's submissions, so you can pick up ideas for other places you might want to improve. This encourages global trends and helps knowledge spread even faster: users see what friends and other users on the network have scanned, and can take inspiration to make similar changes. Our solution makes finding and using eco-friendlier options for all sorts of commodities orders of magnitude easier, and thus contributes to tackling the climate crisis as a whole.

How we built it

Who doesn't want image recognition?

We built this app using a combination of cool technology. We used Firebase cloud ML image recognition for our image-recognition segment, training our cloud model on over 1,000 classes so that users can accurately scan practically anything in their day-to-day lives. For our web scraper, we used BeautifulSoup and Selenium drivers to crawl Bing search results for the most relevant values. We had three web scrapers: one for scraping relevant shopping options and gathering associated data, one for finding alternatives to a given item by name, and another for scraping the carbon footprints of items from publicly available studies and articles. For the list of YouTube videos of DIY solutions, product reviews, and re-use techniques, we used the YouTube Flutter SDK.
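Our real scrapers run BeautifulSoup and Selenium against live Bing results; as a rough, dependency-free sketch of the core idea, here is the same "pull the result links out of fetched HTML" step done with Python's standard-library parser on a hard-coded snippet (the snippet and URLs are made-up placeholders, not real search results):

```python
from html.parser import HTMLParser

# Stand-in for a fetched results page; the real app fetches live HTML
# with Selenium and parses it with BeautifulSoup instead.
SAMPLE_HTML = """
<li class="b_algo"><h2><a href="https://example.com/kelp-shoes">Kelp shoes</a></h2></li>
<li class="b_algo"><h2><a href="https://example.com/eco-soles">Eco soles</a></h2></li>
"""

class ResultLinkParser(HTMLParser):
    """Collects the href of every <a> tag (like soup.find_all('a'))."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

parser = ResultLinkParser()
parser.feed(SAMPLE_HTML)
```

The production scrapers then rank these links by relevance before handing them to the app.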

Simply Not!

To host all of our Python code and interface it with our app, we served it with a Flask server. For now it runs on a local server, so it isn't universally accessible, but we plan to containerize it and host it on a GCP App Engine instance in the future. To store all of a user's scanned entries for later viewing and reference, we used Firebase Firestore, storage buckets, and user authentication. Finally, and perhaps most important of all, we used Flutter, an awesome app-dev framework that lets you ship apps on both iOS and Android and makes beautiful, intuitive UIs really easy. We used Flutter plugins for some cool UI elements and animations.
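In the real server a Flask route wraps this logic; this dependency-free sketch only shows how a response body for one scanned item might be assembled (the field names and placeholder values here are illustrative, not the production API):

```python
import json

def build_scan_response(label, footprint_kg, alternatives):
    """Assemble the JSON body the app receives for one scanned item.

    In production, footprint_kg and alternatives come from the three
    scrapers; here they are passed in directly for illustration.
    """
    return json.dumps({
        "item": label,
        "footprint_kg": footprint_kg,
        "alternatives": alternatives,
    })

body = build_scan_response("shoe", 14.0, ["kelp shoes"])
```

Keeping the handler a pure function like this also makes it easy to test without spinning up the server.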

Challenges we ran into

Perhaps the most irritating of all the bugs in our app was a weird async-loop bug: we had a tiny datatype-conversion error, but because it occurred inside the async loop, the error was hidden, leading us to think it was actually an issue with our API code.

When you spend 5 hours debugging async code...

We eventually figured out the issue, but only after 5 hours of debugging! We also ran into a ton of trouble figuring out exactly how to scrape the results we wanted. Our UI was mostly smooth sailing, but ironically, we had a really rough time uploading images without their backgrounds for assets like our logo and our starting screen. Firebase build issues were also SUPER, SUPER irritating.

Accomplishments we’re proud of

We were SUPER proud of our UI: it's a bit simple, but we thought it looked really clean once we finished, and we put a ton of effort into making the app look nice with all of the extra time we had. We are also proud of our entire server stack, since it was a ton of work and one of the most complex scraping systems we've built to date.

So Proud!

Our image recognition was actually really easy thanks to Firebase's transferable models, but we're really proud of how it turned out (watching our ConvNet magically classify objects never gets old). Most of all, we're proud of our final product, and of being able to say that we made a climate-change solution we would actually use in real life. We're really looking forward to adding new features and putting this thing out into the world!

What we learned

This hackathon was an awesome experience. We learned a lot about team building, ideating, and creating effective solutions to real-world problems with our tech skills. Our UI guys learned a LOT about new widgets, designing different navigation systems, and creating stateless widgets, all of which were firsts for us. Our backend team learned how convenient Firebase's cloud ML is compared to hand-training an AlexNet model, and how to build even deeper web scrapers by searching keywords and headers and then dynamically using that information to follow internal hyperlinks to more relevant results.

We made a ton of mistakes and learned how to address them. Most importantly, we learned how much CO2 we're responsible for (a lot!) by testing our app on backpacks, phones, shoes, stationery, and droves of other household items. We were, quite frankly, shocked, but now we know what we can do to change this (kelp shoes, here we come). We're super encouraged that WE learned things through our app that we never would have otherwise imagined, and that we now have such easy access to solutions.

What's next for GreenWay

Our app really targets younger audiences of millennials and teens like ourselves. We believe we can make our app the next big thing by challenging users to track their CO2 footprints and to post themselves on social media like Instagram and Twitter, directly from the app using those platforms' APIs, trying out some of the cool solutions our app suggests. We obviously still need to run the app through trials: we'll need bug fixes for things like the ML model, which isn't where we want it to be, and we'll need to refine our scraping algorithms to apply even more widely and consistently than before.

We obviously want to put it on the app store, and we may even try a cool VR idea where users could look around their room and actually watch our algorithms analyze all the objects in it (flagged in red/green). We want to build features that develop user interest, so that our app can actually make an impact on the issues we highlighted.


Dylan Syahputra - nalyd#1349 | Prasann Singhal - PrasannSinghal#9723 | MichaelMohn - MichaelMohn624#6613 | Aditya Agrawal - Ballistic [streamer]#2729

A little something extra!
