Inspiration
In the wake of a catastrophe of any kind, people need to recover both mentally and physically. Imagine a world where robot therapists bring warmth, physically assist with rehabilitation, and keep people company and entertained to ease their sense of loss.
What it does
Our Pepper robot learns how to dance straight from any video using a state-of-the-art computer vision algorithm, so anyone can dance with it, be amused, and be warmed from the bottom of their heart. It is especially useful when elderly people need help with rehabilitation but no human therapist is available.
How we built it
We integrated everything in the Choregraphe IDE with Python: using TensorFlow to perform human pose estimation, geometrically mapping image coordinates to world coordinates, controlling each robotic joint, and building Pepper's motion execution plan.
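As a rough sketch of the pose-estimation step (the post only says TensorFlow, so the specific model here, MoveNet from TF Hub, is an assumption, and it would run in a separate Python 3 process rather than inside Choregraphe's Python 2 environment):

```python
# Hedged sketch: pose estimation with MoveNet (an assumed model choice).
import tensorflow as tf
import tensorflow_hub as hub

# MoveNet single-pose "lightning": fast, returns 17 COCO keypoints per frame.
movenet = hub.load("https://tfhub.dev/google/movenet/singlepose/lightning/4")

def estimate_pose(frame):
    """Return a (17, 3) array of (y, x, confidence) keypoints for one video frame."""
    # The lightning variant expects a 192x192 int32 image batch.
    img = tf.image.resize_with_pad(tf.expand_dims(frame, axis=0), 192, 192)
    img = tf.cast(img, dtype=tf.int32)
    outputs = movenet.signatures["serving_default"](img)
    return outputs["output_0"].numpy()[0, 0]  # squeeze (1, 1, 17, 3) -> (17, 3)
```

Each frame's keypoints then go through the coordinate mapping described above to produce joint angles that Pepper can execute.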
Challenges we ran into
The API was quite low-level, so we had to implement the coordinate transformation for several joints ourselves, which was tedious. It was also slightly annoying that the API is still on outdated Python 2. We additionally ran into some technical difficulty filming our sample dance videos.
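To give a flavor of that tedium, here is a minimal sketch of one such transformation (the function names and the exact mapping are hypothetical; the clamp uses Pepper's documented RShoulderRoll range, approximately -1.56 to -0.009 rad):

```python
import math

def segment_angle(p_from, p_to):
    """Planar angle (radians) of the vector p_from -> p_to, given (x, y) pixel points.
    Image y grows downward, hence the sign flip on dy."""
    dx = p_to[0] - p_from[0]
    dy = -(p_to[1] - p_from[1])
    return math.atan2(dy, dx)

def shoulder_roll_from_keypoints(shoulder_xy, elbow_xy):
    """Hypothetical 2D-to-joint mapping: upper-arm image angle -> RShoulderRoll,
    clamped to the joint's limits so Pepper never receives an unreachable target."""
    angle = segment_angle(shoulder_xy, elbow_xy)
    return max(-1.562, min(-0.009, angle))
```

Multiply this by every joint in both arms, each with its own axis convention and limits, and the tedium adds up.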
Accomplishments that we're proud of
We enabled Pepper to imitate human movement to a reasonable extent, although there is still room for improvement.
What we learned
We learned Choregraphe (i.e., Pepper's API) from scratch, including its various motion controls and how to map 2D motions into the 3D world. We also learned about advanced machine learning algorithms that deliver promising human pose estimation.
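For motion control, the NAOqi API (what Choregraphe scripts call under the hood) exposes ALMotion; a minimal Python 2 style sketch of a timed motion plan, with a placeholder robot IP:

```python
# Python 2 style, matching the NAOqi SDK mentioned above. The IP is a placeholder.
from naoqi import ALProxy

motion = ALProxy("ALMotion", "192.168.1.10", 9559)
motion.wakeUp()  # stiffen the joints and stand ready

names  = ["RShoulderPitch", "RShoulderRoll", "RElbowRoll"]
# One list of target angles (rad) per joint, with matching timestamps (s):
angles = [[1.0, 0.3], [-0.3, -1.2], [0.5, 1.3]]
times  = [[1.0, 2.0], [1.0, 2.0], [1.0, 2.0]]
motion.angleInterpolation(names, angles, times, True)  # True = absolute angles
```

Stringing many such keyframes together, one batch per stretch of video, is what turns the estimated poses into a dance.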
What's next for Am I human or dancer? I am a therapist (◑‿◐)
The mapping between human joint movements and robotic joint movements leaves room for improvement. In addition, a deep neural network model that runs more precisely and more quickly is in demand.
Built With
- choregraphe
- computer-vision
- machine-learning
- pepper
- python
- tensorflow