Data visualization and augmented reality are rad, so we're putting them together

What it does

The user looks through the headset, and when they hold out their hand, a graph appears in their palm that they can rotate and view.

How we built it

We duct-taped a webcam, a Leap Motion, and a smartphone to a Google Cardboard headset. A Python web server runs on a laptop; the phone connects to the served webpage, and the image from the webcam is streamed to the smartphone display.
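
The writeup doesn't name the web framework, so here's a minimal sketch of that streaming pipeline using Flask and OpenCV; the route name, port, and camera index are assumptions for illustration, not necessarily what we shipped.

```python
# Minimal sketch of the webcam-to-phone stream: a Flask server that reads the
# headset webcam with OpenCV and serves it as an MJPEG stream the phone's
# browser can display. Framework, port, and camera index are assumptions.
import cv2
from flask import Flask, Response

app = Flask(__name__)
camera = cv2.VideoCapture(0)  # webcam taped to the headset


def mjpeg_frames():
    """Yield JPEG-encoded webcam frames as a multipart MJPEG stream."""
    while True:
        ok, frame = camera.read()
        if not ok:
            break
        ok, jpeg = cv2.imencode(".jpg", frame)
        if not ok:
            continue
        yield (b"--frame\r\n"
               b"Content-Type: image/jpeg\r\n\r\n" + jpeg.tobytes() + b"\r\n")


@app.route("/stream")
def stream():
    # The phone's browser points at http://<laptop-ip>:5000/stream
    return Response(mjpeg_frames(),
                    mimetype="multipart/x-mixed-replace; boundary=frame")


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```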

Challenges we ran into

Lining up the coordinate systems between the OpenCV objects that are drawn and the data gathered from the Leap Motion proved pretty much impossible to get perfect. Using two cameras to make the image three-dimensional posed a similar problem of lining up the coordinate systems between the cameras.
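
For context on why this is hard: the Leap Motion reports hand positions in millimeters in its own coordinate system (origin at the device, y pointing up), while OpenCV draws in pixel coordinates with the origin at the top-left of the frame and y pointing down. A hand-tuned linear mapping like the sketch below is roughly the approach we mean; the range and frame-size constants are hypothetical placeholders, not our calibrated values.

```python
# Rough sketch of mapping a Leap Motion palm position (millimeters, origin at
# the device, y up) into webcam pixel coordinates (origin top-left, y down).
# The ranges and frame size are hypothetical placeholders that would need to
# be tuned by hand for a particular camera/Leap mounting.
FRAME_W, FRAME_H = 640, 480

# Assumed working volume of the hand in Leap coordinates (mm).
LEAP_X_RANGE = (-150.0, 150.0)   # left/right of the device
LEAP_Y_RANGE = (100.0, 400.0)    # height above the device


def leap_to_pixel(leap_x, leap_y):
    """Map a Leap (x, y) position in mm to an (x, y) pixel in the webcam frame."""
    x_min, x_max = LEAP_X_RANGE
    y_min, y_max = LEAP_Y_RANGE
    px = int((leap_x - x_min) / (x_max - x_min) * FRAME_W)
    # Flip y: Leap y grows upward, image y grows downward.
    py = int((1.0 - (leap_y - y_min) / (y_max - y_min)) * FRAME_H)
    # Clamp to the frame so drawing calls don't go off-screen.
    return max(0, min(FRAME_W - 1, px)), max(0, min(FRAME_H - 1, py))
```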

Accomplishments that we're proud of

Successfully linking input from the Leap Motion to objects generated in OpenCV.
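
In rough outline, that link looks like the sketch below: read the palm position from the classic Leap Motion Python SDK (the `Leap` module) and draw a simple OpenCV bar chart at that point in the webcam frame. The data values and bar geometry are made up for illustration, and `leap_to_pixel` is the mapping sketched above.

```python
# Hedged sketch of the Leap -> OpenCV link: place a placeholder bar chart at
# the palm position in each webcam frame. Data and geometry are illustrative.
import cv2
import Leap  # Leap Motion SDK v2 Python bindings (assumed)

DATA = [3, 7, 5, 9, 4]  # placeholder data series


def draw_bar_chart(frame, origin, values, bar_w=12, scale=8):
    """Draw a small bar chart with its lower-left corner at `origin` (pixels)."""
    x0, y0 = origin
    for i, v in enumerate(values):
        x = x0 + i * (bar_w + 4)
        cv2.rectangle(frame, (x, y0 - v * scale), (x + bar_w, y0),
                      (0, 255, 0), thickness=-1)
    return frame


controller = Leap.Controller()
camera = cv2.VideoCapture(0)

while True:
    ok, frame = camera.read()
    if not ok:
        break
    leap_frame = controller.frame()
    if not leap_frame.hands.is_empty:
        palm = leap_frame.hands[0].palm_position  # Leap.Vector in mm
        px, py = leap_to_pixel(palm.x, palm.y)    # mapping from the sketch above
        draw_bar_chart(frame, (px, py), DATA)
    cv2.imshow("Visualize", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
```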

What we learned

How to use the Leap Motion and the Python interface for OpenCV, as well as the basic principles involved in AR.

What's next for Visualize

Using two cameras for a more three-dimensional display, and reducing the latency between the two images.
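
A sketch of how the two-camera version might compose a side-by-side stereo pair, which is the layout Google Cardboard expects; the camera indices and per-eye resolution are assumptions, and real stereo would still need the per-camera alignment described above.

```python
# Hedged sketch of the planned two-camera display: grab a frame from each
# webcam and compose them side by side for a Cardboard stereo view.
# Camera indices and per-eye frame size are assumptions.
import cv2

left_cam = cv2.VideoCapture(0)
right_cam = cv2.VideoCapture(1)

EYE_W, EYE_H = 640, 480  # per-eye resolution (placeholder)

while True:
    ok_l, left = left_cam.read()
    ok_r, right = right_cam.read()
    if not (ok_l and ok_r):
        break
    left = cv2.resize(left, (EYE_W, EYE_H))
    right = cv2.resize(right, (EYE_W, EYE_H))
    stereo = cv2.hconcat([left, right])  # side-by-side frame for the headset
    cv2.imshow("stereo", stereo)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
```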
