Most people want to reduce their carbon footprint. According to a 2016 Pew Research Center survey, 75% of U.S. adults say they are particularly concerned about helping the environment as they go about their daily lives. However, only one in five Americans say they make an effort to live in ways that help protect the environment “all the time.” How can we redesign current technologies to make it easier for Americans to live in ways that help protect the environment?
Current technology is not easily personalized to each user’s ability to contribute to the environment. For example, EcoCRED is an app that measures and tracks your carbon footprint and motivates you to build eco-friendly habits that make a difference. EcoCRED shares our goal of getting people to reduce their carbon footprint and take action, but it fails to give personalized recommendations to each user. Unfortunately, having to manually input all of your data into the app before receiving results turns some users away.
Our app skips this data-entry step by using object detection, and it opens users up to opportunities they may not have seen before by scanning their home environment. Our app motivates people to make small design changes in their homes that have an outsized impact on the environment and their carbon footprints. Each home is unique and has different needs, so ReHome personalizes feedback on how users can improve their carbon footprint based on the items and situations in their home.
What it does
ReHome uses object detection with a neural network to calculate a carbon footprint for your home. With the app, anyone can scan their home in under a minute and get actionable feedback on how to improve its carbon footprint. As users scan their home, objects and situations, such as plastic water bottles, paper towels, and multiple lights left on, are identified and highlighted in green or red. Once an object or situation has been identified, the user can tap the highlighted area to see statistics on the small changes they can make in relation to the detected object and how those changes will affect their home’s carbon footprint. This allows users to “be the change they wish to see in the world” and witness their quantifiable impact on the environment. Users can also see how much they are hurting the environment, hopefully giving them a wake-up call and encouraging them to “live in ways that help protect the environment.”
Another feature that motivates users to redo their home in an environmentally impactful way is the social aspect. Our prototype has a leaderboard on which friends can compare their carbon footprints. Once a user finishes scanning their home, they get an environmental impact score that is either “in the green” or “in the red.” The user can then share a link to the app with as many friends as they would like, encouraging them to “ReHome.” Our app introduces a bit of friendly competition, challenging users to witness their environmental impact and make small positive changes in their own homes.
How we built it
We built the prototype using Figma, which allowed us to visually represent how the app would look from a user’s perspective. We focused on highlighting the main features of our idea, specifically competing with friends and scanning the room. We introduced the competition aspect with the leaderboard and the room-scanning feature with the camera roll and live picture/video option. Other, more detailed features, such as the settings and privacy page, profile creation and updating, and inviting friends, were not fully implemented in this version because they were not central components of our product.
Challenges we ran into
All of us were new to Figma, so we had to learn how to navigate the software to bring our ideas to fruition. Among the things we learned were how to implement loading screens, create a leaderboard, scale pictures, implement vertical scrolling, and create buttons that navigate to different pages.
We also ran into a couple of implementation challenges within the object detection neural network. We managed to come up with some creative solutions to work around these problems, and they are detailed in the “What we learned” section.
Accomplishments that we’re proud of
We are very proud of improving on an existing product to the point where our app can stand on its own. We are very happy to come out of ReHack with an app that we will continue to develop and launch into the marketplace as soon as possible. Climate change is not going to go away on its own, and we believe ReHome will have a measurable impact on the world by changing homes one at a time.
What we learned
We learned how to use Figma to create a prototype of an app that allows for interaction. In addition, we learned how we can individually change our carbon footprints in our homes, and we want to make this information easily accessible for all through ReHome.
What's next for ReHome
To detect objects within an image, we rely on existing technology. We will use a YOLOv3 neural network, a model that, when fed an image, (1) classifies objects and (2) localizes them (i.e., draws a bounding box around each one). This can be coded in Python with relative ease, provided there is a robust, annotated dataset of images. Although ImageNet is a massive dataset that most large computer vision models train on, it does not cover the specific objects our model aims to classify and distinguish (e.g., plastic vs. reusable water bottles). Because of this, we will need time to manually construct a dataset for the YOLO network to train on, incorporating a variety of image types so that the model generalizes (we want it to work for all rooms, not just the developers’ rooms!). One possible way to assist data collection is to reward users (for instance, with 5 points) for sending a labeled picture of their room, which can then be used as training data. We will then use the optimized weights to detect objects in users’ unlabeled room pictures.
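As a sketch of the localization step: YOLO emits many overlapping candidate boxes per object, which are pruned with non-max suppression before being shown to the user. The class names, coordinates, and thresholds below are illustrative placeholders, not our trained model's actual output:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def non_max_suppression(detections, iou_threshold=0.5):
    """Keep the highest-confidence box among heavily overlapping candidates.

    detections: list of (box, confidence, label) tuples.
    """
    kept = []
    for det in sorted(detections, key=lambda d: d[1], reverse=True):
        if all(iou(det[0], k[0]) < iou_threshold for k in kept):
            kept.append(det)
    return kept

candidates = [
    ((0, 0, 10, 10), 0.9, "plastic_bottle"),   # strong detection
    ((1, 1, 11, 11), 0.6, "plastic_bottle"),   # duplicate of the same bottle
    ((50, 50, 60, 60), 0.8, "paper_towels"),   # separate object
]
final = non_max_suppression(candidates)  # duplicate bottle box is dropped
```

In practice a YOLOv3 pipeline (e.g., via OpenCV's DNN module) applies this per class; the pure-Python version above just makes the pruning logic concrete.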
After the objects have been detected, they will be looked up in a predetermined dictionary mapping objects to sustainability scores. Sustainability scores will be determined by the typical carbon footprint of one unit of the object. If, for instance, a lotion has a carbon footprint of 3 kg of CO2, then perhaps 30 points will be subtracted from the user’s score. The total score, which is what the app returns, is the sum of the scores for every detected object. In addition to the score, each dictionary entry will contain a string that gives the user context on the object’s environmental effect. For example, if the YOLO network detects a slice of cheese in the picture, the user will be shown the following pre-written message: “Cheese has a massive carbon footprint, with 12 kg of CO2 produced for every kg of cheese. Suggestion: vegan cheeses have a significantly smaller carbon footprint. See Miyoko’s or Daiya for options.” Images with links to buy each recommended item will also be shown.
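The dictionary lookup described above could look like the following minimal sketch. The point values and messages are placeholder assumptions for illustration, not measured carbon data:

```python
# Placeholder dictionary: detected label -> (points, advice message).
# Negative points lower the user's score; values here are illustrative.
SUSTAINABILITY = {
    "plastic_bottle": (-10, "Plastic bottles add to your footprint. "
                            "Suggestion: switch to a reusable bottle."),
    "reusable_bottle": (5, "Nice! Reusable bottles cut single-use plastic."),
    "cheese": (-30, "Cheese has a massive carbon footprint, with 12 kg of "
                    "CO2 produced for every kg of cheese."),
}

def score_room(detected_labels):
    """Sum per-object points and collect advice for each detected object."""
    total, tips = 0, []
    for label in detected_labels:
        points, tip = SUSTAINABILITY.get(label, (0, ""))  # unknown -> neutral
        total += points
        if tip:
            tips.append(tip)
    return total, tips

total, tips = score_room(["plastic_bottle", "cheese", "lamp"])
```

Unknown labels score zero here; a production version would need a fallback strategy (and the real per-object carbon figures) instead.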
Users will be able to compete with each other based on their cumulative score, which is based on different rooms of their house (kitchen, bedroom, bathroom, and living room). A leaderboard will be established that ranks users based on their scores.
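Ranking by cumulative score across rooms can be sketched in a few lines; the room names match those above, while the usernames and scores are invented for the example:

```python
def cumulative_score(room_scores):
    """Total score across all scanned rooms (dict of room name -> score)."""
    return sum(room_scores.values())

def leaderboard(users):
    """Rank usernames by cumulative score, highest (greenest) first.

    users: dict mapping username -> {room name: score}.
    """
    return sorted(users, key=lambda u: cumulative_score(users[u]), reverse=True)

users = {
    "ana": {"kitchen": -10, "bedroom": 5, "bathroom": 0, "living room": 2},
    "ben": {"kitchen": 20, "bedroom": -5},
}
ranking = leaderboard(users)  # ben (15) ranks above ana (-3)
```

A real backend would persist scores and handle ties, but the ranking rule itself is just a sort on the summed room scores.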
Our business model consists of partnering with sustainable companies (Hydro Flask, Beyond Meat, Seventh Generation, Preserve, and many more) to advertise their environmentally friendly products as recommendations and replacements for users. Since 75% of U.S. adults say they are particularly concerned about helping the environment as they go about their daily lives, ReHome has a potential user base of over 150 million Americans alone. We believe we can reach this user base by making our product easy to use and highly personalized for consumers. The product scales easily under this business model.