The trash can frame and hardware
People spend a valuable few seconds every day pausing in front of waste baskets, deciding which bin to throw something into. If you run the numbers...
(2 sec/occurrence * 2 occurrences/day * 365 days/year) * 200 million people in the U.S. ≈ 80 million total human hours spent deciding per year in America!
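The back-of-envelope math above checks out in a few lines:

```python
# Yearly time spent deciding which bin to use, per the estimate above.
SECONDS_PER_DECISION = 2
DECISIONS_PER_DAY = 2
DAYS_PER_YEAR = 365
PEOPLE = 200_000_000  # rough U.S. adult population

total_seconds = SECONDS_PER_DECISION * DECISIONS_PER_DAY * DAYS_PER_YEAR * PEOPLE
total_hours = total_seconds / 3600
print(f"{total_hours:,.0f} hours per year")  # ≈ 81 million hours
```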
We sure thought it was about time to do something about it. It's trash Can, not trash Cannot!
We've built a CNN, using transfer learning from VGG16, that classifies a waste item into one of three categories: glass-metals-plastic, paper, or landfill. It then automatically deposits the trash into the correct bin, using servos to control rotating platforms.
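The transfer-learning setup looks roughly like the sketch below, in Keras. The classifier head, layer sizes, and hyperparameters here are illustrative assumptions, not the exact values we used:

```python
# Sketch of VGG16 transfer learning for 3-way waste classification.
from tensorflow.keras.applications import VGG16
from tensorflow.keras import layers, models

NUM_CLASSES = 3  # glass-metals-plastic, paper, landfill

# Pretrained convolutional base, without the ImageNet classifier head.
base = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False  # freeze pretrained weights; train only the new head

model = models.Sequential([
    base,
    layers.Flatten(),
    layers.Dense(256, activation="relu"),  # illustrative head size
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```

Freezing the base means only the small new head is trained, which is what makes this feasible on a hackathon-sized dataset.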
Our CNN reaches 95% accuracy on a hacked-together validation set of augmented images (for the hackathon only, of course).
How we built it
The Arduino module uses a proximity sensor to detect when a piece of waste is offered, and pings the Raspberry Pi to take a photo. The Raspberry Pi then loads and runs the neural network on this input image and, based on the classification, turns the correct servos to deposit the trash in the right bin.
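The routing step on the Raspberry Pi boils down to mapping the classifier's output to a platform angle. A minimal sketch of that logic, where the class order and bin angles are hypothetical stand-ins for our actual values:

```python
# Map a classification result to a servo target angle (values illustrative).
CLASSES = ["glass_metal_plastic", "paper", "landfill"]
BIN_ANGLES = {"glass_metal_plastic": 0, "paper": 120, "landfill": 240}

def route(probabilities):
    """Pick the most likely class and return (label, platform angle in degrees)."""
    best = max(range(len(probabilities)), key=probabilities.__getitem__)
    label = CLASSES[best]
    return label, BIN_ANGLES[label]

# e.g. the network is most confident the item is paper:
label, angle = route([0.1, 0.8, 0.1])
# → ("paper", 120)
```

In the real device the returned angle would be fed to the servo driver; the hardware I/O is omitted here.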
Blood, sweat, and tears, as well as Keras, Arduino, Raspberry Pi, a bunch of MechE, and lots of aluminum. We also used the dataset published at https://github.com/garythung/trashnet, and built the CNN ourselves.
Challenges we ran into
Our Raspberry Pi couldn't boot on the second day, so we demoed our works-like software on our computers and attached all of the mechanics to the looks-like recycling-can side.
Beyond that, our main challenges were adapting the Raspberry Pi to run the neural network, physically designing the module, and getting all the technologies to interface with each other.
Accomplishments that we're proud of
Finding creative solutions to our challenges and helping lessen waste of both the physical and temporal types