Trying to get out of our physical bodies to experience the world differently

What it does

Undeterred in the Third enables you to see yourself in the third person using a separate camera feed.

How we built it

We set up a fisheye camera on one phone to capture a ~160° field of view and serve it as an RTSP stream. An app on a second phone, mounted in the Cardboard headset, receives the video and maps it onto a sphere in VR.

Challenges we ran into

Mapping between the fisheye projection and coordinates on the surface of a sphere: this took a few tries, some Googling, and GLSL black magic.
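Our shader was GLSL, but the underlying mapping can be sketched in Python. This is a sketch assuming an equidistant fisheye model (r = f·θ, a common approximation); the actual lens and our shader constants may differ:

```python
import math

def sphere_to_fisheye(theta, phi, f=1.0):
    """Map a direction on the unit sphere to normalized fisheye image
    coordinates, assuming an equidistant fisheye model (r = f * theta).
    theta is the angle from the optical axis, phi the azimuth around it."""
    r = f * theta          # equidistant projection: radius grows linearly with angle
    u = r * math.cos(phi)  # horizontal image coordinate
    v = r * math.sin(phi)  # vertical image coordinate
    return u, v

def fisheye_to_sphere(u, v, f=1.0):
    """Inverse mapping: fisheye image coordinates back to sphere angles,
    used when texturing the sphere from the camera frame."""
    r = math.hypot(u, v)
    theta = r / f
    phi = math.atan2(v, u)
    return theta, phi
```

The inverse direction is the one the fragment shader effectively computes: for each point on the sphere it looks up the matching pixel in the fisheye frame.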

Latency: By default, the Android media API does not let you set the network buffer size, so while the video was smooth, it arrived with about 7 s of latency. Switching to LibVLC gives choppier video but only ~0.5–2 s of latency, depending on network conditions. We also used a portable Wi-Fi hotspot to improve network speed.
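In VLC, the relevant knob is the `network-caching` option, which sets the network buffer in milliseconds. A minimal config sketch; the value and stream URL here are illustrative, not the exact settings we used:

```shell
# Smaller network-caching trades smoothness for lower latency.
# URL and buffer size are placeholders, not our actual values.
vlc --network-caching=500 "rtsp://192.168.1.2:8554/stream"
```

In LibVLC the same option can be passed when creating the instance or as a per-media option.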

Accomplishments that we're proud of

Getting it to work

What we learned

How to collaborate effectively on generating ideas, plus lots of GLSL.

What's next for Undeterred in the Third

We wonder: if the latency could be reduced even further, could we use it to study phantom pain?
