Inspiration
In a world inspired by the grit of Mad Max, survival isn't just about finding resources—it's about doing so without losing a limb. We wanted to build a tool for the post-apocalyptic scavenger: an AI-powered companion that handles the dirty work of identifying and retrieving vital scraps from the radioactive dust.
What it does
Scrapple is an autonomous robotic arm controlled via a sleek, intuitive web interface.
Identify: Using a mounted camera and computer vision, Scrapple scans the environment and highlights "scraps" (useful objects) on your screen.
Select: The user simply clicks a scrap in the UI.
Retrieve: Scrapple’s AI takes over, autonomously calculating the trajectory to pick up the item and safely deposit it into a collection bin.
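The identify → select → retrieve handoff boils down to turning a clicked bounding box into an arm target. Below is a minimal sketch of that mapping, assuming a fixed top-down camera and a simple linear pixel-to-workspace calibration; all function names and dimensions are illustrative, not Scrapple's actual code:

```python
# Sketch of the click-to-retrieve handoff (names/dimensions hypothetical):
# the center of a clicked bounding box in camera pixels is mapped into
# arm workspace coordinates via a pre-calibrated linear interpolation.

def bbox_center(x1, y1, x2, y2):
    """Center of a detection bounding box in pixel coordinates."""
    return ((x1 + x2) / 2, (y1 + y2) / 2)

def pixel_to_workspace(px, py, img_w=640, img_h=480,
                       ws_x=(0.0, 30.0), ws_y=(0.0, 20.0)):
    """Linearly interpolate a pixel position into arm workspace
    coordinates (cm), assuming a fixed, top-down, calibrated camera."""
    wx = ws_x[0] + (px / img_w) * (ws_x[1] - ws_x[0])
    wy = ws_y[0] + (py / img_h) * (ws_y[1] - ws_y[0])
    return wx, wy

# A click on a scrap's bounding box becomes an arm target:
cx, cy = bbox_center(300, 220, 340, 260)   # clicked detection
target = pixel_to_workspace(cx, cy)
print(target)  # (15.0, 10.0)
```

In the real system the trajectory planning on top of this target is handled by the trained policy; the mapping above only covers the geometric handoff.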
How we built it
We combined rugged hardware with cutting-edge machine learning:
The Hardware: A 3D-printed 6-DOF chassis powered by 6V servos and an internal microcontroller.
The Brains: We used the LeRobot package to train a policy on Google Colab that imitates human pick-up demonstrations.
The Vision: YOLOv8 handles real-time object detection.
The Stack: A Flask backend bridges the gap between the AI models and a modern TypeScript/Vite frontend styled with Tailwind CSS.
The AI: We used Google Gemini and ElevenLabs to power a talking AI agent built into the interface. This makes it easier to operate the arm with voice commands, and it keeps you company in a lonely world.
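To give a flavor of how the Flask backend hands detections to the TypeScript frontend, here is a hedged sketch of a JSON payload builder. The field names and detection tuples are hypothetical, not the real Scrapple schema; in the live system the boxes come from YOLOv8:

```python
import json

# Illustrative sketch: serialize (label, confidence, x1, y1, x2, y2)
# detection tuples into the JSON the frontend could use to draw
# clickable "scrap" overlays. Field names are assumptions, not the
# actual Scrapple API.

def detections_to_payload(detections):
    """Build a JSON string describing detected scraps for the UI."""
    scraps = [
        {
            "id": i,
            "label": label,
            "confidence": round(conf, 2),
            "box": {"x1": x1, "y1": y1, "x2": x2, "y2": y2},
        }
        for i, (label, conf, x1, y1, x2, y2) in enumerate(detections)
    ]
    return json.dumps({"scraps": scraps})

payload = detections_to_payload([("bottle", 0.91, 300, 220, 340, 260)])
print(payload)
```

A Flask route would return a payload like this from the detection loop, and the frontend would render each `box` as a selectable highlight.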
Challenges we ran into
The "Wasteland" started in the terminal. We fought through dependency hell and hair-pulling frontend/backend merge conflicts. Training was a race against time as Google Colab timed out, often threatening to wipe our weights before they could be saved. We also had a brief "security breach" scare involving accidentally committed API keys, proving that even in the apocalypse, DevOps is hard.
Accomplishments that we're proud of
The "Eureka" moment: sending a command from a browser button and watching a physical arm, built from scratch, successfully grasp an object. Seeing the YOLOv8 bounding boxes align perfectly with the robotic movement was our ultimate win. It's officially ready for doomsday.
What we learned
The biggest takeaway? Hardware-software integration is the final boss. Building a cool UI is one thing, and a moving arm is another, but making them "talk" to each other in real-time is where the true engineering happens. This was a masterclass in full-stack robotics.
What's next for Scrapple
To make Scrapple the ultimate survivor's tool, we’re looking to:
Hardening the Build: Enhancing the PID loops and model accuracy for more robust, efficient gathering in "unstable" environments.
Auto-Sorting: Once the bin is full, we want Scrapple to use its vision to automatically sort scraps by material—separating the "precious metals" from the "junk."
Mobile Deployment: Moving the backend off the laptop and onto an edge device for true portability in the field.
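As a taste of the PID hardening planned above, here is a minimal discrete PID controller driving a toy first-order joint model. The gains, the plant model, and all names are illustrative, not Scrapple's tuned values:

```python
# Minimal discrete PID controller sketch for a single servo joint.
# Gains and the toy plant below are illustrative assumptions.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        """One control step: returns the command for the servo."""
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a toy integrator plant toward a 90-degree setpoint:
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.02)
angle = 0.0
for _ in range(2000):
    angle += pid.update(90.0, angle) * 0.02  # simplistic plant response
print(round(angle, 2))  # settles near the 90-degree setpoint
```

On the real arm, the "plant" is a 6V servo with friction and load, which is exactly why gain tuning for "unstable" environments is on the roadmap.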