We wanted to recreate Cooking Mama in virtual reality, but without the controllers that make VR less accessible. Our "controller" is a piece of colored paper on the back of the user's hand, which is tracked by the library tracking.js!
Try it yourself!
Attach a bright yellow solid object to the back of your hand, go to the link provided, and move your hand around within view of the webcam! If you have a Google Cardboard, you can even experience it in VR! It has to be a Cardboard or something similar because our program relies on the video input from your phone's camera to track the colors. If you do use a Cardboard, make sure the phone camera is unobscured.
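The core of the hand tracking can be sketched as follows. tracking.js provides a `ColorTracker` that reports bounding rectangles for regions of a given color; we then move an aframe.js entity to match. The coordinate mapping below (`toWorldPosition`, the scene dimensions, and the `#webcam`/`#hand` selectors) is our own illustration of the idea, not code lifted from the project.

```javascript
// Map a tracking.js pixel coordinate into A-Frame world space.
// A-Frame uses meters with the origin at the scene center; assume the
// hand moves within a 2m x 1.5m window in front of the camera.
const SCENE_WIDTH = 2;
const SCENE_HEIGHT = 1.5;

function toWorldPosition(px, py, videoWidth, videoHeight) {
  // Webcam pixels: origin top-left, y grows downward.
  // A-Frame: origin at center, y grows upward; the webcam image is also
  // mirrored relative to the user, so flip x as well.
  const x = (0.5 - px / videoWidth) * SCENE_WIDTH;
  const y = (0.5 - py / videoHeight) * SCENE_HEIGHT;
  return { x, y, z: -1 }; // keep the hand 1m in front of the camera
}

// Browser-only wiring (requires tracking.js and an A-Frame scene):
if (typeof tracking !== 'undefined') {
  const tracker = new tracking.ColorTracker(['yellow']);
  tracking.track('#webcam', tracker, { camera: true });
  tracker.on('track', (event) => {
    event.data.forEach((rect) => {
      // Use the center of the detected color blob as the hand position.
      const cx = rect.x + rect.width / 2;
      const cy = rect.y + rect.height / 2;
      const pos = toWorldPosition(cx, cy, 640, 480);
      document.querySelector('#hand')
        .setAttribute('position', `${pos.x} ${pos.y} ${pos.z}`);
    });
  });
}
```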
A few of the challenges we faced were tracking.js being quite laggy, having to manually create depth data, and efficiently pushing tracking.js data to aframe.js. We noticed that the greater the contrast between the tracked color and the background, the less laggy tracking.js became. We ended up taping a black tablecloth to the wall for maximum contrast, which significantly decreased lag. Also, tracking.js does not natively return depth information, only x and y position. To simulate depth, we drew rectangles from the controller's coordinates and continuously compared the rectangle for the current frame against the rectangle for the previous frame. This proved too heavy for our mobile device to handle, so we had to drop the depth implementation.
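The depth heuristic above can be sketched as a small pure function: if the bounding rectangle of the tracked color grows between frames, the hand is moving toward the camera; if it shrinks, it is moving away. The function name and the jitter threshold are our own assumptions for illustration, not constants from the project.

```javascript
// Ignore area changes under 10% as frame-to-frame jitter (assumed value).
const AREA_THRESHOLD = 0.1;

// Compare the tracked rectangle's area across two consecutive frames.
// Returns 'closer', 'farther', or 'same'.
function estimateDepthChange(prevRect, currRect) {
  const prevArea = prevRect.width * prevRect.height;
  const currArea = currRect.width * currRect.height;
  if (prevArea === 0) return 'same'; // nothing tracked last frame
  const ratio = currArea / prevArea;
  if (ratio > 1 + AREA_THRESHOLD) return 'closer';
  if (ratio < 1 - AREA_THRESHOLD) return 'farther';
  return 'same';
}
```

Running this once per tracked frame is what proved too expensive on our phone; batching the comparison to every few frames would be one way to cut the cost.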
In the future we wish to implement an additional controller for a second hand and more objects, which shouldn't be too complicated since we have laid down the basic framework for this application. Optimizing our depth algorithm to pass depth data to aframe more efficiently would also be very interesting to implement.