Inspiration
91% of plastic doesn't actually get recycled. Even though recycling has become a more prominent topic in recent years, many people still neglect it. We wanted to make the process easier and reduce confusion by helping people distinguish between recyclable and non-recyclable trash.
What it does
The app feeds live images from the camera to a machine learning model, which classifies the object in front of the camera in real time based on the data it was trained on. Once the model identifies the object, the user is told what type of waste it is and how to dispose of it. The user is also shown a map of the recycling centres closest to them.
How we built it
Azure Custom Vision and Core ML were the two major tools we used to train and deploy the computer vision model in the application. We collected 100 images for each of 20 different types of waste, varying the lighting, the object's position in the frame, and the background. After the image sets were added and tagged, we clicked the Train button to let Custom Vision's machine learning engine train a model on the images. Swift was then used to design and code the app.
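A minimal sketch of how a Custom Vision model exported to Core ML can classify a camera frame through Apple's Vision framework. The model class name `WasteClassifier` is a hypothetical placeholder, not our actual exported model name:

```swift
import Vision
import CoreML
import CoreVideo

// Sketch: wrap a Core ML classifier (exported from Azure Custom Vision)
// behind a Vision request so camera frames can be classified directly.
final class WasteClassifierService {
    private let request: VNCoreMLRequest

    init() throws {
        // "WasteClassifier" is an assumed name for the .mlmodel added to the Xcode project.
        let coreMLModel = try WasteClassifier(configuration: MLModelConfiguration()).model
        let visionModel = try VNCoreMLModel(for: coreMLModel)
        request = VNCoreMLRequest(model: visionModel)
    }

    // Classify a single video frame and return the top label with its confidence.
    func classify(pixelBuffer: CVPixelBuffer) throws -> String? {
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
        try handler.perform([request])
        guard let top = (request.results as? [VNClassificationObservation])?.first else {
            return nil
        }
        return "\(top.identifier) (\(Int(top.confidence * 100))%)"
    }
}
```

In practice this would be called from the camera's `AVCaptureVideoDataOutputSampleBufferDelegate` callback, once per frame or throttled to a few frames per second.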
Challenges we ran into
At first, identifying objects was a minor problem because of lighting and other aspects of videography. We fixed this by creating a larger data set for the model to learn from, which also allowed us to support more objects than we had originally planned. Another challenge we ran into was passing data from our camera view controller to our informational view controller. We got around it by using static variables.
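A simplified sketch of the static-variable hand-off between the two screens. The class and segue names here are illustrative, not our exact identifiers:

```swift
import UIKit

// Sketch: share the classification result between view controllers
// through a static (type-level) property instead of segue plumbing.
final class CameraViewController: UIViewController {
    // Latest classification result, readable from any other screen.
    static var detectedWaste: String?

    func didClassify(_ label: String) {
        CameraViewController.detectedWaste = label
        performSegue(withIdentifier: "showInfo", sender: self) // assumed segue id
    }
}

final class InfoViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Read the value the camera screen stored.
        if let waste = CameraViewController.detectedWaste {
            title = waste
        }
    }
}
```

This works as a quick hackathon fix; the more idiomatic route is to set a property on the destination controller in `prepare(for:sender:)`.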
Accomplishments that we're proud of
No doubt, the accomplishment we're most proud of is setting up the machine learning model. We all came to this hackathon with little to no experience in ML, but we liked the idea we came up with, we stuck with it, and we made it happen. Another accomplishment worth mentioning is that, although none of us knew each other before coming here, we worked as a unified group from start to finish.
What we learned
We learned that staying up and coding is worth it. Persistence is one of the key attributes we believe sets groups apart. We were familiar with Xcode and Swift, but incorporating machine learning, which was new to us, was a great learning experience.
What's next for GreenVision
In the near future, we plan to improve the app by expanding the database behind our machine learning feature. In addition, we plan to encourage people to recycle by adding a leaderboard system and prizes. We do this in the hope of a cleaner future.
Built With
- azure
- computer-vision
- coreml
- machine-learning
- microsoft
- swift
- xcode