Inspiration
AirBrush started from a simple concept: What if you could control a drone from your phone?
But not in the way you think.
What it does
Many an engineer has built the "phone as a remote control," but have they considered the phone as a spatial instrument? What if a "flick" or a "swish" translated into a move left or right?
How we built it
There are two main approaches we considered for controlling a drone:

- An "open" system, in which the drone acts purely as a combinational machine of inputs: spatial motions of a device (a phone) translate directly into simple movements of the drone.
- A "closed" system, in which we use 3D spatial detection/reasoning to track the drone's position, and use that information to decide the drone's next state. Essentially a much more controlled drone.
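The "open" approach boils down to a direct lookup from gesture to command. A minimal sketch, with illustrative gesture names and command strings (not our actual protocol):

```python
# "Open" system sketch: phone gestures map straight to drone movements,
# with no feedback from the drone's position. All names are illustrative.

GESTURE_TO_COMMAND = {
    "flick_left": "move_left",
    "flick_right": "move_right",
    "swish_up": "ascend",
    "swish_down": "descend",
}

def translate(gesture: str) -> str:
    """Translate a phone gesture into a simple drone command."""
    # Unknown gestures fall back to hovering in place.
    return GESTURE_TO_COMMAND.get(gesture, "hover")
```

The closed-loop variant layers position feedback on top of this same mapping.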
With an (un)healthy amount of async Python, and some surgery to reverse-engineer the remote controls of some cheap drones we found lying around, we were able to successfully shoot commands from an external source into the drone control matrix (a.k.a. the controller).
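In spirit, the command injection looks something like the sketch below. The framing (a 1-byte opcode followed by four signed axis bytes) and opcode values are hypothetical stand-ins; the real reverse-engineered protocol differed:

```python
import asyncio
import struct

# Hypothetical framing for shooting commands into the controller.
# Opcode values and layout are illustrative, not the actual protocol.
OPCODES = {"stick": 0x01, "takeoff": 0x02, "land": 0x03}

def frame(opcode: str, throttle=0, yaw=0, pitch=0, roll=0) -> bytes:
    """Pack a command into a small binary frame: opcode + 4 signed axes."""
    return struct.pack("Bbbbb", OPCODES[opcode], throttle, yaw, pitch, roll)

async def send_command(writer: asyncio.StreamWriter, opcode: str, **axes) -> None:
    """Write one framed command to an open stream and flush it."""
    writer.write(frame(opcode, **axes))
    await writer.drain()
```

From there, an asyncio event loop can stream stick updates at a steady rate while the phone keeps producing gestures.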
Challenges we ran into
The key challenge was mapping the drone's position in 3D space. By falling back on a simpler approach, ye olde camera-based tracking, we were able to integrate the drone with much more limited hardware. Integrating a large number of systems proved quite difficult as well, with certificates out the wazoo required to maintain secure HTTPS sockets for spatial data transfer between phone and webserver.
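The core of camera-based tracking is simple: find the drone's bright blob in each frame and take its centroid. A toy, pure-Python sketch on a frame represented as a 2D list of grayscale intensities (a real pipeline would run OpenCV on live camera frames):

```python
# Toy camera-based tracker: threshold bright pixels, return their centroid.
# Frames here are 2D lists of 0-255 intensities; values are illustrative.

def track(frame, threshold=200):
    """Return the (row, col) centroid of pixels >= threshold, or None."""
    hits = [(r, c)
            for r, row in enumerate(frame)
            for c, v in enumerate(row)
            if v >= threshold]
    if not hits:
        return None  # drone not visible in this frame
    n = len(hits)
    return (sum(r for r, _ in hits) / n, sum(c for _, c in hits) / n)
```

Tracking the centroid frame-to-frame gives the position estimate that the closed-loop control consumes.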
The biggest issue we faced was PID tuning. Keeping the drone steady in one place was nearly impossible with the drone quality we chose, but countless hours (~6-7) spent on calibration inched us ever closer to a "stable" system.
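For the curious, a minimal PID controller of the kind we spent those hours tuning. The gains are placeholders; our real values came out of calibration:

```python
# Minimal PID controller sketch for hover stability. Gains are placeholders.

class PID:
    def __init__(self, kp: float, ki: float, kd: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self._integral = 0.0
        self._prev_error = None

    def update(self, setpoint: float, measured: float, dt: float) -> float:
        """Return a control output pushing `measured` toward `setpoint`."""
        error = setpoint - measured
        self._integral += error * dt
        derivative = 0.0 if self._prev_error is None else (error - self._prev_error) / dt
        self._prev_error = error
        return self.kp * error + self.ki * self._integral + self.kd * derivative
```

One controller per axis (altitude, x, y), each fed by the camera tracker's position estimate, is the usual arrangement.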
Accomplishments that we're proud of
We are quite proud of the various features we got working on their own, as well as the workarounds we found to mitigate the inefficiencies of the graciously provided hardware. By doing odd things such as emulating the Seeed ESP32-S3s as USB webcams via UVC, we were able to achieve much higher-resolution, lower-latency images to feed our imaging pipelines.
What we learned
We learned that sometimes it really is just _that_ hard to hack something, but with some creative juices flowing (also known as Monster Energy / Celsius) and a touch of Claude Code (sorry, Codex), you can achieve anything in a rather short period of time.
What's next for AirBrush
Painting da Mona Lisa