Inspiration
I've been fascinated by animatronics for a while and was looking for an excuse to get into a good beginner hardware project. This project achieved both while still being entertaining.
What it does
The animatronic eye "looks" at a person. The effect is achieved by running the video feed through DeepSORT to detect and track a person, using some math to map their position to servo angles, then relaying that data over a serial connection to a microcontroller driving the servos.
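The position-to-servo step can be sketched roughly as below. This is a hypothetical illustration, not the project's actual code: the frame resolution, servo sweep, serial port, and message format are all assumptions.

```python
# Hypothetical sketch: map a tracked person's bounding-box center to
# pan/tilt servo angles. Frame size and servo range are assumed values.

FRAME_W, FRAME_H = 640, 480          # assumed camera resolution
PAN_RANGE = TILT_RANGE = (0, 180)    # assumed servo sweep in degrees

def bbox_to_angles(x1, y1, x2, y2):
    """Convert a DeepSORT-style bounding box to (pan, tilt) angles."""
    cx = (x1 + x2) / 2               # horizontal center of the person
    cy = (y1 + y2) / 2               # vertical center of the person
    pan = PAN_RANGE[0] + (cx / FRAME_W) * (PAN_RANGE[1] - PAN_RANGE[0])
    tilt = TILT_RANGE[0] + (cy / FRAME_H) * (TILT_RANGE[1] - TILT_RANGE[0])
    return round(pan), round(tilt)

# A person centered in the frame puts both servos at their midpoint:
print(bbox_to_angles(300, 220, 340, 260))  # -> (90, 90)

# Relaying the angles over serial (e.g. with pyserial) might look like:
# import serial
# with serial.Serial("/dev/ttyUSB0", 9600) as port:
#     pan, tilt = bbox_to_angles(*box)
#     port.write(f"{pan},{tilt}\n".encode())
```

The microcontroller on the other end would parse each line and set the servo positions accordingly.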
How we built it
I 3D-printed all of the physical components on Friday night while learning the DeepSORT API for the project. I spent most of Saturday writing code, working out bugs, and trying to get the LCD display to work.
Challenges we ran into
Originally, I had far more planned for the project. I was hoping to incorporate a conversation-driven AI bot to give life to the eye. Unfortunately, the LCD display I had planned to use burned out, which made the whole concept of conversing feel disconnected from the hardware. I elected to drop that aspect of the project and focus on the simpler tracking feature.
Accomplishments that we're proud of
I went from an idea to a physical, functional device over the course of the hackathon. I finally had a good excuse to use the 3D printer I had recently purchased, and I improved my circuitry skills along the way.
What we learned
Components, even new ones out of the box, are subject to failure. Sometimes it can't be helped, and the best course of action is to move on.
What's next for Animatronic Tracking Eye
I would love to take my original idea further and build an animatronic that can converse naturally. With more time, I plan to redesign it from the ground up, with my own model and mechanics.
Built With
- deepsort
- python
- raspberry-pi
- servo