Inspiration
Sprout was inspired by my wife. I originally started this project as a small desk companion robot, loosely inspired by the Star Wars Gonk droid. I used this Thingiverse model as the physical base: https://www.thingiverse.com/thing:1568652
As I printed the parts, wired the electronics, and began programming basic movement, my wife casually suggested that it could move her plants around the house. That idea immediately felt more interesting and useful than my original plan, so I pivoted.
I didn’t want Sprout to feel like just another gadget. I wanted it to feel like a small, opinionated part of the household. That’s what led me to give it a voice and a bit of personality.
What it does
Sprout is an IoT companion robot that autonomously moves plants around a desk or counter and interacts with you through sound.
It uses:
- A Raspberry Pi Pico microcontroller and servo motors for locomotion
- MQTT for messaging and control
- Text generation via the Gemini API
- Voice output via the ElevenLabs API
Sprout walks around, relocates plants, and provides lightly sarcastic commentary while doing so.
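The MQTT side boils down to sending small commands from a controller to the robot. As a sketch of how that could look (the topic name and message fields here are hypothetical, not Sprout's actual protocol), commands can be serialized as JSON payloads:

```python
import json

# Hypothetical topic and message shape -- the real pipeline may differ.
COMMAND_TOPIC = "sprout/commands"

def make_command(action, steps=0):
    """Serialize a movement command for publishing over MQTT."""
    return json.dumps({"action": action, "steps": steps})

def handle_command(payload):
    """Decode a command payload on the robot side."""
    msg = json.loads(payload)
    return msg["action"], msg.get("steps", 0)
```

In practice a client such as paho-mqtt would publish `make_command("walk", 4)` to `COMMAND_TOPIC`, and the Pico-side subscriber would pass the payload to `handle_command`.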
How we built it
The build process was… iterative.
I started by digging through old microcontroller boards until I found one that still worked. From there, I spent a lot of time calibrating servos and adapting them to the 3D-printed parts. At one point, servos were literally cut apart and jumper wires were duct-taped back together just to test motion.
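Most of the calibration work is mapping servo angles to PWM pulse widths. A minimal sketch of that conversion, assuming a standard 50 Hz hobby servo with 500 to 2500 microsecond pulses (the exact endpoints are what get tuned per servo):

```python
def angle_to_duty_u16(angle_deg, min_us=500, max_us=2500, period_us=20000):
    """Map a servo angle (0-180 degrees) to a 16-bit PWM duty value.

    min_us/max_us are the pulse-width endpoints tuned per servo during
    calibration; period_us is the 20 ms frame of a 50 Hz servo signal.
    """
    angle_deg = max(0.0, min(180.0, angle_deg))  # clamp to the servo's range
    pulse_us = min_us + (max_us - min_us) * angle_deg / 180.0
    return round(pulse_us / period_us * 65535)
```

On the Pico under MicroPython, the returned value would feed `machine.PWM(...).duty_u16(...)` on each servo pin.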
Eventually, I got a rough walking gait working. That was followed by a lot more tuning, frustration, and experimentation.
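A walking gait is essentially a looped set of servo keyframes. One simple way to structure it (a sketch, not Sprout's exact gait code) is linear interpolation between timed poses:

```python
def interpolate_gait(keyframes, t):
    """Linearly interpolate servo angles between gait keyframes.

    keyframes: list of (time_s, {servo_name: angle_deg}) sorted by time.
    t is wrapped onto the cycle length so the gait loops forever.
    """
    cycle = keyframes[-1][0]
    t = t % cycle
    for (t0, a0), (t1, a1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)
            return {s: a0[s] + (a1[s] - a0[s]) * f for s in a0}
    return dict(keyframes[-1][1])
```

Tuning then becomes editing the keyframe table rather than rewriting motion code, which is what made iterating on the gait bearable.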
I briefly attempted hand tracking, but ran into repeated Arduino IDE crashes late at night and decided to shelve that feature for later. I also spent more time than I’d like to admit experimenting with ElevenLabs voices to find the right tone for remotely sassing my wife while her plants moved around.
In the end, I made the conscious decision to cut features rather than force everything in at once. Features cut for this version include:
- Hand tracking
- On-device speaker hardware
- Sleep
- General sanity
Challenges we ran into
- Repeated Arduino IDE crashes caused by the hand detection model
- Servo tuning and mechanical alignment
- Managing a growing nest of tiny wires without losing connections
- Resisting the urge to keep adding features instead of stabilizing the core
Accomplishments that we're proud of
- Developing a repeatable process for tuning servos that will help future robot builds
- Setting up a complete MQTT pipeline that allows the Raspberry Pi Pico–based bipedal robot to be controlled by another microcontroller or any external program

What we learned
- Tune servos first. Always.
- Add AI and perception models only after motion and control are stable
What's next for Sprout AI
The list goes on and on, but first I want to finish what I have and get a demo out; I simply ran out of time to wrap everything up. The server isn't working right now and needs some tuning to get everything communicating as expected. Once that's done, I want to design a custom PCB so I don't have a rat's nest of wires, add a speaker circuit so the robot can talk on its own, and set up either hand tracking or light tracking. After that: obstacle and edge detection so it can survive, a proper head, and stronger materials than cardboard so it can hold heavier plants.
Built With
- elevenlabs
- gemini
- node.js
- python
- raspberry-pi