Inspiration
At Rice, a new composting program rolled out in the last month, so students are now able to compost their food waste after eating. However, we noticed two problems: (a) a lot of students were not composting at all, and (b) the compost bin often had contamination, such as plastic containers or aluminum foil, which aren't compostable. We wanted to find a way to encourage more people to compost, and to compost well.
What it does
CompostID incentivizes composting and provides education on how to compost.
Users take a snapshot of their plate after eating, which is then run through image recognition provided by Google Cloud Vision AI to tell them what they can and cannot compost. After taking the picture, users are incentivized to compost again by seeing tangible numbers of how they helped the environment and receiving rewards, such as Tetra or discounts at businesses. By combining these two components, we hope that eventually, composting will become a sustainable habit that users will implement into their daily lives.
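The sorting step can be sketched roughly like this. This is a minimal illustration, not the app's actual logic: the label lists and function names here are assumptions, and the real app maps many more Vision AI labels.

```typescript
// Sketch: bucket the label strings returned by image recognition into
// compostable items, contaminants, and things we can't classify.
// The two label sets below are illustrative assumptions only.
const COMPOSTABLE = new Set(["food", "fruit", "vegetable", "bread", "napkin"]);
const CONTAMINANTS = new Set(["plastic", "aluminum foil", "styrofoam"]);

interface SortResult {
  compost: string[];
  trash: string[];
  unknown: string[];
}

// Sort detected labels into the three buckets (case-insensitive match).
function sortLabels(labels: string[]): SortResult {
  const result: SortResult = { compost: [], trash: [], unknown: [] };
  for (const label of labels) {
    const l = label.toLowerCase();
    if (COMPOSTABLE.has(l)) result.compost.push(label);
    else if (CONTAMINANTS.has(l)) result.trash.push(label);
    else result.unknown.push(label);
  }
  return result;
}
```

The `unknown` bucket matters for the education side: anything the app can't classify is a chance to show the user a tip instead of guessing.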
How we built it
After discussing our MVP, Taylor and Jessica worked on the design and color aesthetics in Figma, Jing wrote the code in React Native and TypeScript, and Sierra did most of the research on composting and why it's so important. Then we connected Firebase and Google Cloud Vision AI so that we could store images and analyze them through the Vision AI API. It was the first time any of us had done this, so it took a few hours, but the learning experience was a lot of fun and we all had our own takeaways. After working through the API calls, we tested some examples and wrote test cases in the app so that we could demo a walk-through!
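The Vision AI call boils down to one REST request. Below is a minimal sketch using the Cloud Vision `images:annotate` endpoint with label detection; the function names are ours, and passing the API key as a parameter is a simplification for illustration (in practice it would live in app config, and the image would come from Firebase or the device camera as base64).

```typescript
// Shape of a Cloud Vision images:annotate request body for label detection.
interface AnnotateRequest {
  requests: {
    image: { content: string };
    features: { type: string; maxResults: number }[];
  }[];
}

// Build the JSON body for a label-detection request from a base64 image.
function buildAnnotateRequest(base64Image: string): AnnotateRequest {
  return {
    requests: [
      {
        image: { content: base64Image },
        features: [{ type: "LABEL_DETECTION", maxResults: 10 }],
      },
    ],
  };
}

// Send the photo to the Vision API and return the detected label strings.
async function detectLabels(base64Image: string, apiKey: string): Promise<string[]> {
  const res = await fetch(
    `https://vision.googleapis.com/v1/images:annotate?key=${apiKey}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(buildAnnotateRequest(base64Image)),
    }
  );
  const json = await res.json();
  return (json.responses?.[0]?.labelAnnotations ?? []).map(
    (a: { description: string }) => a.description
  );
}
```

The returned label strings are what the app then sorts into compostable vs. non-compostable items.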
Challenges we ran into
Only one of our team members is a computer science major, so she took on all the coding and technical aspects (thanks Jing!). That also meant the rest of us had to learn the process of building an app and the technical terms we kept hearing. But we did it :)
Accomplishments that we're proud of
We're proud that our app addresses a real problem in our community! In our college, we've hosted info sessions, Q&A's, and even literally sat by the compost bin to help people sort their compost, but none of those have the same potential impact as our app, which can reach far more people.
What we learned
We learned how to really draw on all of our expertise in a multidisciplinary environment. Some of us were good at coding, others at research, others at drawing and design, and others really liked Chipotle (their app is very nice!). Despite not being a conventional hackathon team (we have 2 bio majors/premeds, 1 stat major, and 1 comp major), we adapted by drawing on our strengths to create something unconventional but also relevant to our community and the world.
What's next for CompostID
With the images collected from people taking pictures of their meals, we hope to be able to use machine learning to better identify compostable and non-compostable items, especially given the large varieties of plates or utensils out there. Because of our limited data science background, we were only able to use a pre-trained image recognition model, but it would be amazing to see more precise recognition that would be able to recognize specific items, and maybe even materials.
Built With
- google-cloud-vision-api
- javascript
- react-native
- typescript