Inspiration
The bins in our university dormitories were a mess. Even when people genuinely wanted to recycle, the friction of figuring out which bin to use meant most waste ended up unsorted. We wanted to remove that decision entirely — what if the bin just knew?
What it does
UniBin is a smart waste bin that automatically sorts everything placed on its lid. A mounted camera and an AI vision model identify the type of waste in real time, then motors pivot the lid to dump it into the correct compartment. Ultrasonic sensors inside each section track how full each one is.
The companion Streamlit dashboard lets users monitor all of their UniBins remotely — seeing live fill levels, when each compartment was last emptied, and a log of every item that's been sorted.
UniBin eliminates the guesswork and friction that make manual waste sorting so ineffective, lowering the barrier to a cleaner, greener campus for everyone.
How we built it
The system is split into four cooperating services connected through a central Flask API server:
Camera classifier — We use OpenAI's CLIP (ViT-B/32) for zero-shot image classification. A phone camera streams live video over WiFi via MJPEG; our Python script reads each frame, encodes it alongside curated text prompts for each waste category, and picks the highest-confidence match. No training data or fine-tuning required — we just describe what belongs in each bin and CLIP figures it out.
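The core of the zero-shot approach can be sketched as follows. This is a minimal illustration, not our exact script: the example prompts are hypothetical stand-ins for our curated ones, and the embeddings are assumed to come from CLIP's image and text encoders upstream (here they're just plain vectors so the decision logic is visible).

```python
import numpy as np

# Hypothetical category prompts, similar in spirit to the curated ones we
# feed CLIP's text encoder -- one description per physical compartment.
PROMPTS = {
    0: "a photo of a recyclable plastic bottle",
    1: "a photo of a crumpled piece of paper",
    2: "a photo of general landfill waste",
}

def classify(image_feat: np.ndarray, text_feats: np.ndarray) -> tuple[int, float]:
    """Pick the bin whose prompt embedding best matches the camera frame.

    image_feat: (D,) embedding of the frame (from CLIP's image encoder).
    text_feats: (N, D) embeddings of the prompts (from CLIP's text encoder).
    Returns (bin_index, confidence), where confidence is a softmax probability.
    """
    # Normalize so the dot product becomes cosine similarity, as CLIP does.
    image_feat = image_feat / np.linalg.norm(image_feat)
    text_feats = text_feats / np.linalg.norm(text_feats, axis=1, keepdims=True)
    logits = 100.0 * (text_feats @ image_feat)  # CLIP's standard logit scale
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    best = int(np.argmax(probs))
    return best, float(probs[best])
```

Because classification is just "which text description matches best," adding a new compartment is a one-line change to the prompt list rather than a retraining job.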
Flask mailbox server — A lightweight REST API acts as the central hub. The camera POSTs the detected bin number, the serial bridge polls for it, and the dashboard fetches live fill-level data. Everything talks through simple JSON endpoints.
Arduino serial bridge — A Python bridge script connects the server to the physical hardware over USB serial. It forwards bin numbers to the Arduino (which drives the servo motors to pivot the lid) and reads back FILL:xx.x ultrasonic sensor readings, posting them to the server.
Hardware — An Arduino Uno drives the sorting mechanism and reads an HC-SR04 ultrasonic sensor to measure how full each compartment is. The physical bin and lid mechanism were designed in CAD and 3D-printed.
Streamlit dashboard — The companion web app polls the server and uses a version-counter mechanism to silently update the UI only when new sensor data arrives, avoiding disruptive full-page reloads.
The entire pipeline — camera to classification to physical sorting to live monitoring — runs end-to-end with four terminal commands.
Challenges we ran into
Bridging the gap between the Python AI stack and the Arduino microcontroller world was the hardest part. Getting USB serial communication reliable across different Arduino and ESP32 boards, handling connection drops gracefully, and keeping the whole pipeline in sync without race conditions took significant iteration. Formatting sensor data so every component could parse it cleanly (FILL:xx.x over serial → JSON → Streamlit) was deceptively tricky to get right.
What we learned
This project was a crash course in full-stack IoT — integrating a machine learning model, a REST API, serial hardware communication, 3D-printed mechanical design, and a live web dashboard into one cohesive system. We learned how to make software and hardware talk to each other reliably, how to design simple but effective inter-process communication protocols, and how to collaborate across very different skill sets under time pressure.
What's next for UniBin
- Weight sensors for more accurate fill detection that complements the ultrasonic readings
- A fine-tuned or larger vision model to improve classification accuracy on edge cases and support fully custom compartment categories
- On-device inference running directly on an ESP32-CAM to eliminate the need for a separate computer
- Multi-bin fleet management with historical analytics, collection scheduling, and alerts when compartments are nearly full