Gaming is great, but you know what would make it even better? If instead of being on a cramped computer or phone screen, your game was actually projected onto the floor of your room! Even better, what if the player avatar was a real, physical robot? It may sound like science fiction, but we wanted to bring this vision to life. And thus, Funball was born.
How it works
Funball coordinates three physical components - a Kinect, a Sphero robot, and a projector - with Python code running on a computer. The projector casts the game field onto the floor, and the Python code corrects for the distortion caused by the projector's angle. The Kinect runs an image-processing routine that detects the Sphero as it moves around the field, and OpenCV and numpy routines convert the Sphero's position in the Kinect's camera feed into 2D game coordinates, letting the robot interact with in-game objects.
Challenges we ran into
Getting the Kinect to properly recognize the Sphero took quite a bit of work. Doing it efficiently to avoid lag was even harder. However, we were ultimately able to piece together a tracking routine that is both accurate and fast.
Accomplishments that we're proud of
While Funball is a great technical demo, it is also an addictively fun game in its own right!
What we learned
We gained a lot of experience with the Kinect API and OpenCV's image-processing functions.
What's next for Funball
Funball proved that it is possible to get physical and virtual game elements cooperating almost seamlessly. Such technology could be useful in future projects, not just games.