A PSEye camera tracks the user's motion and detects flapping gestures while the user wears an Oculus headset showing a first-person Flappy Bird view. Each detected flap is sent to an Arduino, which drives a servo that taps the smartphone running the game. A second PSEye camera captures the phone's screen, and those frames are processed to decode the game state. The decoded state is sent over ZeroMQ to another machine, which renders a 3D projection and streams the video to the Oculus. A copy of the same data also goes to a third machine, where a learning algorithm uses local optimizations to decide when to jump and signals the Philips Hue lights accordingly. Finally, the user is photographed mid-flap and emailed the picture. Rough sketches of a few of these steps follow.
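
The flap-detection step could look something like the following: a minimal sketch using OpenCV frame differencing, assuming a camera readable through `cv2.VideoCapture` and an Arduino on a serial port (the device path, baud rate, and threshold are all assumptions, not the project's actual configuration).

```
import cv2
import serial

CAMERA_INDEX = 0            # assumed index of the PSEye camera
MOTION_THRESHOLD = 500_000  # assumed tuning constant for "a flap happened"

arduino = serial.Serial("/dev/ttyACM0", 9600)  # assumed port and baud rate
cap = cv2.VideoCapture(CAMERA_INDEX)

_, prev = cap.read()
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Sum of absolute pixel differences is a crude proxy for arm motion.
    motion = cv2.absdiff(gray, prev).sum()
    prev = gray
    if motion > MOTION_THRESHOLD:
        arduino.write(b"F")  # tell the Arduino to fire the servo tap
```

On the Arduino side, the firmware would only need to watch the serial line for that byte and pulse the servo against the phone screen.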
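Shipping the decoded game state to the renderer and the learner might use a ZeroMQ PUB socket like the one below. This is a sketch under assumptions: the JSON message format, the PUB/SUB topology, and the port are placeholders, since the project's actual wire format isn't documented.

```
import json
import zmq

ctx = zmq.Context()
pub = ctx.socket(zmq.PUB)
pub.bind("tcp://*:5556")  # assumed port

def publish_state(bird_y, pipe_gaps):
    """Broadcast one frame's decoded state to all subscribers
    (the 3D renderer and the learning machine)."""
    state = {"bird_y": bird_y, "pipes": pipe_gaps}
    pub.send_string(json.dumps(state))
```

A PUB/SUB pattern fits here because the same state fans out to two independent consumers without either one blocking the screen-decoding loop.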
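The learner's output step could signal its jump decisions through the standard Philips Hue REST API, as in this sketch; the bridge IP, API username, light ID, and the green/red color scheme are all assumptions standing in for whatever the local-optimization learner actually does with the lights.

```
import requests

BRIDGE = "192.168.1.2"         # assumed bridge IP
USERNAME = "hue-api-username"  # assumed registered API user
LIGHT_ID = 1                   # assumed light

def signal_decision(jump):
    """Turn the Hue light green on a jump decision, red otherwise."""
    state = {"on": True, "hue": 25500 if jump else 0, "sat": 254}
    requests.put(
        f"http://{BRIDGE}/api/{USERNAME}/lights/{LIGHT_ID}/state",
        json=state,
    )
```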
