We wanted to create a friendly robotic companion while learning computer vision and working with SDKs.
What it does
Cheer Bot recognizes facial expressions and reacts empathetically to those nearby. If it senses that you are happy, it joyfully spins in a few circles to celebrate. If it believes you're sad, it does its best to comfort you by nuzzling close to you.
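The happy/sad behavior above boils down to a small dispatch from a detected emotion label to a robot action. Here's a minimal sketch; the label strings and the `spin`/`nuzzle`/`idle` action names are our assumptions for illustration, not the bot's actual code.

```python
# Hypothetical sketch of Cheer Bot's reaction logic. On the real robot,
# "spin" and "nuzzle" would translate into DC motor commands.
def react(emotion: str) -> str:
    """Map a detected emotion label to a robot behavior name."""
    reactions = {
        "happy": "spin",    # celebrate by spinning in circles
        "sad": "nuzzle",    # comfort by nuzzling close
    }
    return reactions.get(emotion, "idle")  # anything else: stay put
```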
How we built it
We built it in separate modules and combined them later.
- Raspberry Pi
- TFT SPI Display
- DC Motors
- Voltage controller (3.3-3.6V)
- Logitech Camera
- 22,000 mAh external battery pack
- Zip ties, wires, breadboard
The Hardware: wires, breadboards, motor controllers, and lots of double-sided tape.
The Camera: We used the AI2GO SDK's emotion recognition functions on the camera feed.
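Since the modules were built separately, the glue between the camera module and the reactions can be sketched as a simple polling loop. The `detect` callable below stands in for the SDK's emotion call, which we don't reproduce here; the action names are the same illustrative assumptions as above.

```python
# Hypothetical wiring of the camera/emotion module to the reaction module.
# `detect` is a stand-in for the real emotion-recognition call.
from typing import Callable, List

def poll_reactions(detect: Callable[[], str], n_frames: int) -> List[str]:
    """Poll the detector n_frames times and map each label to an action."""
    actions = {"happy": "spin", "sad": "nuzzle"}
    return [actions.get(detect(), "idle") for _ in range(n_frames)]
```

Injecting `detect` as a parameter keeps the camera code swappable, which is handy when the modules are developed and tested independently.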
The Graphics: We used Adafruit libraries to display faces on the rover, with the face determined by the emotion recognition output. The faces (neutral, happy, sad) were made using MediBang Paint.
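Picking which face to draw on the TFT is another small mapping from the recognizer's output to an image asset. A minimal sketch, assuming hypothetical file names and a neutral fallback (the real asset names aren't in this writeup):

```python
# Sketch of choosing the display face from the emotion label.
# File names and the neutral fallback are assumptions for illustration.
def face_image(emotion: str) -> str:
    """Return the image file to show on the TFT for a given emotion."""
    faces = {"happy": "happy.png", "sad": "sad.png", "neutral": "neutral.png"}
    return faces.get(emotion, "neutral.png")  # unknown labels show neutral
```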
Challenges we ran into
- Balancing weight of all components
- Numpy (So many problems, we're too exhausted to explain)
- Wiring issues with the motor controllers
- Debugging code
- Trouble gaining access to the Pi due to display limitations
Accomplishments that we're proud of
- We made a functioning bot!
What we learned
- How to combine hardware and software across many different components
What's next for cheerbot-ai2go
- A wider range of emotions, audio reactions, and a faster response time
We are all Computer Engineering students from the University of Washington-Bothell and this is our first time at DubHacks!