## The Inspiration

Walking through urban neighborhoods, the sight of a perfectly good mahogany dresser or a repairable LED monitor sitting next to a dumpster is heartbreaking. This "waste" is actually a resource, trapped by a lack of visibility. I was inspired to treat the city as a living inventory: a Wild West where treasure is hidden in plain sight.
## How I Built It

The architecture is a high-performance blend of modern and classic tech:

- **The Vision Pipeline:** FFmpeg extracts frames from CCTV streams. These are processed by an object-detection model paired with a vision-language model (VLM); I used the Groq API to confirm an item is truly "abandoned."
- **The Backend & Logic:** FastAPI acts as the orchestrator. For database logic, I replaced traditional ORMs with Smalltalk, treating data as living objects that are persisted in SQLite.
- **The Interface:** The UI is built with shadcn/ui, featuring a real-time map using MapLibre GL JS and OpenStreetMap to plot "treasures" the moment they are detected. ElevenLabs provides narrated alerts for accessibility.
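A minimal sketch of the frame-extraction step. The stream URL, sampling rate, and output pattern below are illustrative assumptions, not the project's actual settings; the idea is simply that sampling frames at a low rate keeps the downstream VLM workload manageable.

```python
import shlex

def ffmpeg_frame_cmd(stream_url: str, out_dir: str, fps: float = 0.5) -> list[str]:
    """Build an ffmpeg command that samples frames from a live stream.

    fps=0.5 means one frame every two seconds -- a hypothetical rate
    chosen to keep VLM calls cheap while still catching curbside items.
    """
    return [
        "ffmpeg",
        "-i", stream_url,            # CCTV / RTSP / HLS input
        "-vf", f"fps={fps}",         # sample frames at the given rate
        "-q:v", "2",                 # high JPEG quality for the detector
        f"{out_dir}/frame_%06d.jpg", # numbered output frames
    ]

cmd = ffmpeg_frame_cmd("rtsp://example.local/cam1", "/tmp/frames")
print(shlex.join(cmd))
```

Each saved frame can then be handed to the detector, with only candidate crops forwarded to the VLM for the final "abandoned or not" verdict.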
## What We Learned

Prompt engineering for classification is genuinely hard. It's not just about being descriptive — it's about anticipating every failure mode and building fences around them.
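To make "building fences" concrete, here is a hedged sketch of what such a classification prompt and its defensive parser might look like. The label set, thresholds, and fence wording are illustrative assumptions, not the production prompt; the key pattern is that anything malformed or ambiguous falls back to a safe default instead of a false "abandoned" alert.

```python
import json

# Illustrative prompt: fences off common failure modes (items still in use,
# items for sale, construction material) and demands a strict JSON reply.
CLASSIFY_PROMPT = """You are labeling curbside objects in a street photo.
Reply with JSON only: {"label": "...", "confidence": 0.0-1.0}.
Labels: "abandoned", "in_use", "for_sale", "construction", "unclear".
Fences:
- A person touching or sitting on the item => "in_use".
- A price sign or storefront in frame => "for_sale".
- Pallets, lumber, or cones nearby => "construction".
- When in doubt, prefer "unclear" over "abandoned"."""

VALID_LABELS = {"in_use", "for_sale", "construction", "unclear"}

def parse_verdict(raw: str) -> str:
    """Parse the model reply defensively: malformed output or a
    low-confidence 'abandoned' call degrades to 'unclear'."""
    try:
        data = json.loads(raw)
        label = data.get("label")
        conf = float(data.get("confidence", 0.0))
    except (ValueError, TypeError, AttributeError):
        return "unclear"
    if label == "abandoned":
        return "abandoned" if conf >= 0.8 else "unclear"
    return label if label in VALID_LABELS else "unclear"

print(parse_verdict('{"label": "abandoned", "confidence": 0.95}'))  # abandoned
print(parse_verdict("not json at all"))                             # unclear
```

The asymmetry is deliberate: a missed dresser costs nothing, but a false alert erodes trust in the map.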
## What's Next?

I've learned that AI isn't just for chatbots; it's a tool for physical sustainability. The next step is real neighborhood pilots: we want to partner with a block association or community organization to put this on actual camera feeds, not just YouTube streams. The system is ready; it just needs cameras pointed at real curbs. Wild Waste proves that with the right stack, we can turn "lost" items into found community value.
## Built With
- auth
- fastapi
- ffmpeg
- maplibre
- shadcn
- smalltalk
- vast.ai