Inspiration
We were up late at night brainstorming after eating a bunch of snacks. Then, someone picked up a piece of trash from the table, and... we knew what to do.
What it does
It's a mobile app that identifies trash in an image and sorts it to determine which bin it should go into (recycling, garbage, etc.).
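The sorting step can be sketched as a simple mapping from detected classes to bins. This is an illustrative sketch only; the class names and bin assignments below are hypothetical, not the app's actual label set.

```python
# Hypothetical mapping from detector labels to disposal bins.
# Labels and bin choices are illustrative, not the app's real label set.
BIN_FOR_CLASS = {
    "plastic_bottle": "recycling",
    "aluminum_can": "recycling",
    "paper_cup": "recycling",
    "food_wrapper": "garbage",
    "styrofoam": "garbage",
    "banana_peel": "compost",
}

def sort_detections(labels):
    """Map each detected trash label to the bin it belongs in.

    Unknown labels fall back to "garbage" as a safe default.
    """
    return {label: BIN_FOR_CLASS.get(label, "garbage") for label in labels}
```

For example, `sort_detections(["plastic_bottle", "styrofoam"])` routes the bottle to recycling and the styrofoam to garbage.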
How we built it
Training the AI model: We started from a pre-trained YOLOv5 model. Since it was not built for garbage detection, we fine-tuned it on the TACO (Trash Annotations in Context) dataset for 16 epochs.
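One piece of the fine-tuning pipeline is converting labels: TACO ships COCO-style annotations (pixel `[x_min, y_min, width, height]` boxes), while YOLOv5 expects normalized `[x_center, y_center, width, height]`. A minimal conversion sketch (a standalone helper, not the project's actual code):

```python
def coco_to_yolo(bbox, img_w, img_h):
    """Convert a COCO pixel bbox [x_min, y_min, w, h] to YOLO's
    normalized [x_center, y_center, w, h] format."""
    x_min, y_min, w, h = bbox
    x_center = (x_min + w / 2) / img_w
    y_center = (y_min + h / 2) / img_h
    return [x_center, y_center, w / img_w, h / img_h]
```

Each converted box is written to a per-image `.txt` label file alongside its class index, which is the layout YOLOv5's training script reads.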
Making the app: We used Swift to create the iOS app, which provides a camera view that overlays identification boxes on the garbage it detects.
Challenges we ran into
Finding a model and dataset: Finding an accurate pre-trained AI model was very challenging. To work around this, we took a less accurate pre-trained model and fine-tuned it on a task-specific dataset to improve its accuracy.
Training the model: Training the AI model took a long time. We found that using Google Colab's GPUs sped up the process considerably.
Multiplatform support: We originally wanted to support Android devices, but ran into many problems with React Native and Flutter.
Accomplishments that we're proud of
We are very proud to have turned an inaccurate pre-trained model into a much more precise one.
What we learned
We learned a lot about mobile app development and object detection using machine learning.
What's next for EcoVision
One major next step for EcoVision is to expand its accessibility beyond iOS devices (e.g., to Android users). We could also add hardware features, such as specialized trash cans that sort waste automatically, or a standalone device that tells you the category without needing a phone.