Gazinga! The drone has spotted some nice goodies.
Wowie Zowie, that's some quality depth perception! Good luck stumbling into objects with that kind of clarity!
Drones are cool, identifying objects is cool, and seeing those objects' depth in space is also cool. Wowza! The prime ingredient list for manufacturing the coolest project here at Tribehacks IV.
What it does
The drone can fly around and identify objects. Furthermore, it estimates the depth of those objects. We have a cutesy little web interface that shows the regular camera feed, the object-detection view, and the depth view. Lots o' information at the tips of your fingers.
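As a rough sketch of how those three views could be stitched into one display frame for a web page (the function and frame sizes here are illustrative, not our actual code; frames are assumed to arrive as same-sized NumPy images, with the depth map already colorized to three channels):

```python
import numpy as np

def compose_views(raw, detections, depth):
    """Stack the raw camera view, the object-detection overlay,
    and the (colorized) depth map side by side into one frame.

    All three inputs are assumed to be H x W x 3 uint8 images.
    """
    assert raw.shape == detections.shape == depth.shape
    return np.hstack([raw, detections, depth])

# Toy black frames standing in for the real feeds.
h, w = 120, 160
raw = np.zeros((h, w, 3), dtype=np.uint8)
boxes = np.zeros((h, w, 3), dtype=np.uint8)
depth = np.zeros((h, w, 3), dtype=np.uint8)

frame = compose_views(raw, boxes, depth)
print(frame.shape)  # (120, 480, 3)
```

The composed frame could then be served as, say, an MJPEG stream or repeated image fetches; the write-up doesn't pin down which mechanism the interface used.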
How we built it
We used a server to pipeline the AR Drone's video feed into Python, then ran TensorFlow models for both depth estimation and object detection.
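The shape of that pipeline might look something like the sketch below. The frame source and both model calls are stand-ins (the real project read the drone's stream through a relay server and ran actual TensorFlow models); only the loop structure is meant to match.

```python
import numpy as np

def frames_from_drone(n=3, h=120, w=160):
    """Stand-in for the real video feed; the actual project pulled the
    AR Drone 2.0's stream into Python through a small relay server."""
    for _ in range(n):
        yield np.zeros((h, w, 3), dtype=np.uint8)

def detect_objects(frame):
    """Stand-in for a TensorFlow object detector:
    returns (label, bounding-box) pairs."""
    return [("placeholder", (0, 0, 10, 10))]

def estimate_depth(frame):
    """Stand-in for a TensorFlow depth model:
    one estimated depth value per pixel."""
    return np.zeros(frame.shape[:2], dtype=np.float32)

# Each frame gets both treatments; the results feed the web interface.
results = []
for frame in frames_from_drone():
    results.append((detect_objects(frame), estimate_depth(frame)))

print(len(results))  # 3
```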
Challenges we ran into
The toughest part was connecting Python to the drone. A close second goes to the ML component and linking it all together.
Accomplishments that we're proud of
Everything, really. Lots of difficult parts that only came together towards the end.
What we learned
Lots and lots of server stuff. We got really good at installing and then deleting useless files. Connecting the drone to Python took a lot of trial and error (and head bashing), but we've learned from it, for sure. Lots of friendly mentors helped, too.
What's next for Object and Depth Detection with AR Drone 2.0
Using the object and depth information, we should be able to create a drone that can fly around by itself fairly easily. (Famous last words, right?) In theory, the information could be used to track and follow an object, avoid obstacles, and report to the user what it sees via text. One could even hook it up with Alexa to have her report what's going on in the world of the AR Drone 2.0. Who knows what will be found? To quote the late Frank Herbert: "A universe of surprises, that is what I pray for." And what better way to explore such a universe than with an object and depth detecting AR Drone 2.0?
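A toy sketch of how the depth map could drive obstacle avoidance (a policy we haven't flown, just the idea: split the view into left/center/right thirds and steer toward the most open one; all names and thresholds here are hypothetical):

```python
import numpy as np

def steer_from_depth(depth, stop_threshold=0.5):
    """Toy avoidance policy. `depth` holds an estimated distance per
    pixel (larger = farther away). Steer toward the third of the view
    with the most open space; stop if everything ahead is too close."""
    thirds = np.array_split(depth, 3, axis=1)  # left, center, right
    means = [t.mean() for t in thirds]
    if max(means) < stop_threshold:
        return "stop"
    return ["left", "forward", "right"][int(np.argmax(means))]

# A wall looming on the right side of the frame, open space elsewhere.
depth = np.ones((120, 160), dtype=np.float32)
depth[:, 110:] = 0.1
print(steer_from_depth(depth))  # left
```

Following a tracked object would be the mirror image: steer toward its bounding box instead of away from the nearest obstacle.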