This showcases the initialization code, which presents the walkthrough screens but does not repeat them after the user has opened the app once.

This code enables our machine learning implementation: the user opens their camera, and we run the ML model to identify the brand.

This is the code that implements our algorithms. We display tips based on the user's survey answers and their carbon footprint.
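The run-once walkthrough behavior described above can be sketched with `UserDefaults`. This is a minimal illustration, not the project's actual code: the key and view controller names here are placeholders.

```swift
import UIKit

// Sketch of a show-once walkthrough check (hypothetical names throughout).
final class LaunchRouter {
    private static let walkthroughKey = "hasSeenWalkthrough"  // placeholder key

    /// Returns the first screen to show: the walkthrough on first launch,
    /// the main app on every launch after that.
    static func initialViewController() -> UIViewController {
        let defaults = UserDefaults.standard
        if defaults.bool(forKey: walkthroughKey) {
            return MainTabBarController()       // hypothetical main entry point
        } else {
            defaults.set(true, forKey: walkthroughKey)
            return WalkthroughViewController()  // hypothetical walkthrough screen
        }
    }
}
```

Because the flag persists in `UserDefaults`, the walkthrough is skipped on all subsequent launches.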
Inspiration
Reducing your emissions is, frankly, hard. Buying from sustainable companies, tracking how much carbon you output, and recognizing the environmental mistakes you're making are all tedious tasks. We wanted an app that makes it easy to identify how to be more environmentally responsible and, in turn, helps you make sustainable choices. We wanted to know how much carbon we output, which companies to buy from, and how to reduce our carbon footprint on a personal level, so we got to work!
What it does
Our app has 3 main components:
1) Carbon Calculator
Carbon Calculator allows you to take a short survey and then uses our algorithm to help calculate how much carbon you output.
2) Product Tracker
Our product tracker uses Machine Learning and Computer Vision to identify companies and provide details about their environmental practices.
3) eMISSION tips
We use data from our carbon calculator to provide customized tips to help you lower your carbon footprint, including unique ideas which are not hard to do in your daily lives.
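The tip-selection idea can be sketched as a simple mapping from a footprint estimate to a band of suggestions. The thresholds and tip text below are made up for illustration; the real app derives its tips from the survey algorithm.

```swift
// Hypothetical footprint bands and tips; thresholds are placeholders.
enum FootprintBand { case low, medium, high }

func band(forAnnualKgCO2 kg: Double) -> FootprintBand {
    switch kg {
    case ..<5_000:  return .low
    case ..<12_000: return .medium
    default:        return .high
    }
}

func tips(for band: FootprintBand) -> [String] {
    switch band {
    case .low:
        return ["Keep it up! Try a meat-free day each week."]
    case .medium:
        return ["Combine errands into one car trip.",
                "Wash clothes in cold water."]
    case .high:
        return ["Consider public transport for your commute.",
                "Switch to LED bulbs throughout your home."]
    }
}
```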
How we built it
We built our entire application using Swift, Apple's programming language for iOS development. We placed a huge emphasis on good user interface design, so we used the UIKit and SnapKit frameworks to build better, more appealing screens within our application.
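The SnapKit style we used for layout looks roughly like this. The view controller and view names are illustrative, not the app's actual ones.

```swift
import UIKit
import SnapKit

// Sketch of SnapKit-based layout (hypothetical screen).
final class SurveyViewController: UIViewController {
    private let titleLabel = UILabel()
    private let nextButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = .systemBackground
        view.addSubview(titleLabel)
        view.addSubview(nextButton)

        // SnapKit keeps constraints concise compared with raw NSLayoutConstraint.
        titleLabel.snp.makeConstraints { make in
            make.top.equalTo(view.safeAreaLayoutGuide).offset(24)
            make.leading.trailing.equalToSuperview().inset(20)
        }
        nextButton.snp.makeConstraints { make in
            make.top.equalTo(titleLabel.snp.bottom).offset(32)
            make.centerX.equalToSuperview()
        }
    }
}
```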
As for the machine learning model used to detect company logos, we used CoreML and CreateML to build models that identify companies by their logos. This took tons of labeling, image scraping, and data gathering, but it allowed us to build an effective model that produced accurate predictions.
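Running a CreateML image classifier through the Vision framework typically looks like the sketch below. `LogoClassifier` stands in for the Xcode-generated model class; the real model name and label set come from the training data.

```swift
import Vision
import CoreML

// Sketch of classifying a logo with a CreateML model via Vision.
// `LogoClassifier` is a placeholder for the generated model class.
func classifyLogo(in image: CGImage, completion: @escaping (String?) -> Void) {
    guard let mlModel = try? LogoClassifier(configuration: MLModelConfiguration()).model,
          let visionModel = try? VNCoreMLModel(for: mlModel) else {
        completion(nil)
        return
    }
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // Vision returns classifications sorted by confidence.
        let best = (request.results as? [VNClassificationObservation])?.first
        completion(best?.identifier)  // e.g. a company-name label
    }
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```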
Finally, our carbon tracker was built using Firebase and Swift, logging user data and running our algorithm to classify and quantify that information to determine the user's carbon output. The research aspect was lengthy, as we had to perform many conversions and deep searches to find the bits of information that, combined, made for an accurate algorithm.
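The conversion-based approach can be sketched as multiplying each survey answer by an emission factor and summing the results. The factors below are placeholders for illustration, not the researched values the app actually uses.

```swift
// Sketch of a conversion-factor footprint calculation.
// All factors are placeholders, NOT the app's researched values.
struct SurveyAnswers {
    var milesDrivenPerWeek: Double
    var flightsPerYear: Double
    var meatMealsPerWeek: Double
    var kWhPerMonth: Double
}

func annualFootprintKgCO2(_ a: SurveyAnswers) -> Double {
    let carKgPerMile    = 0.4    // placeholder emission factor
    let flightKgEach    = 250.0  // placeholder emission factor
    let mealKgEach      = 3.0    // placeholder emission factor
    let gridKgPerKWh    = 0.45   // placeholder emission factor

    return a.milesDrivenPerWeek * 52 * carKgPerMile
         + a.flightsPerYear * flightKgEach
         + a.meatMealsPerWeek * 52 * mealKgEach
         + a.kWhPerMonth * 12 * gridKgPerKWh
}
```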
Challenges we ran into
We ran into challenges with our machine learning pipeline, specifically in producing a pre-processed image that we could run our model on. We eventually figured out how to build a pixel buffer that let us run the model easily, and this simplified our implementation a great deal.
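The pixel-buffer step can be sketched as rendering a `UIImage` into a `CVPixelBuffer` sized for the model's input. The 224x224 size here is an assumption; the actual dimensions depend on the trained model.

```swift
import UIKit
import CoreVideo

// Sketch: render a UIImage into a CVPixelBuffer for a CoreML model.
// The 224x224 input size is an assumed model dimension.
func pixelBuffer(from image: UIImage,
                 width: Int = 224, height: Int = 224) -> CVPixelBuffer? {
    var buffer: CVPixelBuffer?
    let attrs = [kCVPixelBufferCGImageCompatibilityKey: true,
                 kCVPixelBufferCGBitmapContextCompatibilityKey: true] as CFDictionary
    guard CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                              kCVPixelFormatType_32ARGB, attrs,
                              &buffer) == kCVReturnSuccess,
          let pb = buffer else { return nil }

    CVPixelBufferLockBaseAddress(pb, [])
    defer { CVPixelBufferUnlockBaseAddress(pb, []) }

    // Draw the image into the buffer's backing memory.
    guard let context = CGContext(data: CVPixelBufferGetBaseAddress(pb),
                                  width: width, height: height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: CVPixelBufferGetBytesPerRow(pb),
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue),
          let cgImage = image.cgImage else { return nil }

    context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))
    return pb
}
```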
What's next for eMISSION
In the future, we hope to add machine learning to our carbon tracking algorithm so it can use prior data to make better, more precise calculations. We also want to expand our computer vision model to cover more companies so users are not restricted to our current selection.
Built With
- computer-vision
- coreml
- ios
- machine-learning
- swift
- uikit
- xcode