Inspiration
We wanted to build something that mixes AI, hardware, and environmental impact in a way that's actually useful. Everyone throws trash away, but very few people sort it correctly—and most campuses or public places have no idea how full their bins are or how much contamination they get.
trashCam was born from the idea of turning an ordinary trash can into a smart, reactive, AI-powered system that could see what’s being thrown away and help facilities track sustainability metrics.
What it does
trashCam uses an Intel RealSense camera, Raspberry Pi, and a laptop to:
- Detect what trash is being thrown away in real time
- Categorize it into recycling, organic, or general waste
- Estimate bin fill level
- Track mis-sorted items
- Display everything on a retro-styled dashboard
- Drive a touchscreen display on the bin

It's a full, AI-powered system that turns a normal bin into an interactive device.
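The fill-level estimate above can be derived from a single top-down depth frame. Below is a minimal sketch using a synthetic NumPy array in place of real RealSense depth data; the `empty_depth_m` calibration value and the median-based surface estimate are illustrative assumptions, not the project's actual method.

```python
import numpy as np

def estimate_fill_level(depth_frame: np.ndarray, empty_depth_m: float) -> float:
    """Estimate bin fill fraction from a top-down depth frame.

    depth_frame: 2-D array of distances (meters) from the camera to
    whatever surface it sees inside the bin.
    empty_depth_m: calibrated distance to the bin floor when empty
    (an assumed calibration step, not from the original write-up).
    """
    # Ignore invalid (zero) readings, which depth sensors commonly emit.
    valid = depth_frame[depth_frame > 0]
    if valid.size == 0:
        return 0.0
    # Median distance to the trash surface; a closer surface means a fuller bin.
    surface = float(np.median(valid))
    fill = 1.0 - surface / empty_depth_m
    return max(0.0, min(1.0, fill))

# Synthetic frame: surface 0.3 m away in a 0.6 m-deep bin => about half full.
frame = np.full((10, 10), 0.3)
print(round(estimate_fill_level(frame, empty_depth_m=0.6), 2))  # 0.5
```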
How we built it
We built the system as a hybrid pipeline:
- Camera → Raspberry Pi streams video using an MJPEG streamer
- Laptop backend runs YOLO-based computer vision to detect trash types
- React Retro UI displays live stats, fill levels, logs, and system health
- Pi Touchscreen Display receives signals from the Pi and renders the point cloud
- Logging layer writes data into JSON/CSV for analytics
The entire project is modular, so each component can run independently.
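The logging layer's label-to-category mapping and JSON/CSV writes might look like the stdlib-only sketch below. `CATEGORY_MAP`, the file names, and `log_detection` are hypothetical names for illustration, not the project's actual schema.

```python
import csv
import json
import time
from pathlib import Path

# Hypothetical mapping from YOLO class labels to the three bin categories.
CATEGORY_MAP = {
    "bottle": "recycling",
    "can": "recycling",
    "banana": "organic",
    "apple": "organic",
    "cup": "general",
}

def log_detection(label: str, confidence: float, log_dir: Path) -> dict:
    """Append one detection event to both a CSV log and a JSON Lines log."""
    event = {
        "timestamp": time.time(),
        "label": label,
        "category": CATEGORY_MAP.get(label, "general"),
        "confidence": confidence,
    }
    log_dir.mkdir(parents=True, exist_ok=True)

    csv_path = log_dir / "detections.csv"
    write_header = not csv_path.exists()
    with csv_path.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(event.keys()))
        if write_header:
            writer.writeheader()
        writer.writerow(event)

    # JSON Lines: one event per line, easy to stream into a dashboard.
    with (log_dir / "detections.jsonl").open("a") as f:
        f.write(json.dumps(event) + "\n")
    return event

event = log_detection("bottle", 0.91, Path("logs"))
print(event["category"])  # recycling
```

Appending one line per event keeps both logs crash-safe: a partial run never corrupts earlier records, and the analytics side can tail the files live.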
Challenges we ran into
- Getting low-latency video streaming stable on the Pi
- Avoiding duplicate detections for the same falling item
- Balancing model accuracy with speed so it could detect mid-air trash
- Building a retro-style UI while keeping it responsive
- Synchronizing LED feedback with actual detected events
- Weird bugs involving MJPEG boundaries and OpenCV frame drops
- Making sure Pi hardware, camera modules, and LEDs worked together over WiFi
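One common fix for the duplicate-detection problem is a per-label cooldown window: a falling item triggers YOLO hits across several consecutive frames, so repeats of the same label within a short window are suppressed. The sketch below illustrates the idea; the 2-second cooldown and the class labels are assumptions, not the project's tuned values.

```python
import time

class DetectionDeduplicator:
    """Suppress repeat detections of the same falling item.

    Repeats of a label within `cooldown_s` seconds of its last sighting
    are treated as the same event. Each sighting refreshes the window,
    so a long continuous stream of hits still counts as one item.
    """

    def __init__(self, cooldown_s: float = 2.0):
        self.cooldown_s = cooldown_s
        self._last_seen: dict = {}

    def is_new_event(self, label: str, now: float = None) -> bool:
        now = time.monotonic() if now is None else now
        last = self._last_seen.get(label)
        self._last_seen[label] = now
        return last is None or (now - last) >= self.cooldown_s

dedup = DetectionDeduplicator(cooldown_s=2.0)
print(dedup.is_new_event("bottle", now=0.0))  # True: first sighting
print(dedup.is_new_event("bottle", now=0.5))  # False: same fall, repeat frame
print(dedup.is_new_event("bottle", now=3.0))  # True: a later, separate item
```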
Accomplishments that we're proud of
- A fully working AI trash-classification system
- Clean visual dashboard with animated retro theming
- Reliable LED feedback loop that reacts instantly
- Designing a pipeline that feels like a real product, not just a demo
- Smooth integration between hardware, ML, network streaming, and UI
- Turning a trash can into a functional smart environmental device
What we learned
- How to optimize object detection for real-time streaming
- The quirks of Raspberry Pi video pipelines and MJPEG/RTSP servers
- Designing UIs that communicate data clearly under a themed style
- Handling event deduplication, logging, and system state prediction
- Hardware-software synchronization over WiFi
- Why real-world IoT systems need careful attention to latency and stability
What's next for trashCam
- Multi-bin support (trash + compost + recycling)
- Mobile app with notifications and analytics
- Voice guidance (“That should go in recycling!”)
- Local tiny-YOLO model running directly on the Pi
- Gamification for environmental engagement on campuses
- Cloud dashboard for long-term sustainability tracking
- Adding depth sensing to estimate volume more accurately
- Deployment in residence halls and public campus trash areas
Built With
- computervision
- fastapi
- gemini
- python
- raspberry-pi
- react
- realsense
- yolo