Have you ever wanted to be in two places at once? Our team, SnackDC, took the Oculus Rift and decided it needed a window into the physical world of atoms as well as bits. We fitted the Rift with cameras that deliver fluid 60fps video and high-quality microphones, giving the wearer externalized photographic and audio hardware. By bridging Oculus data with stereoscopic video and WebGL, people can extend their optic nerves into the lives of others. Built almost entirely on Linux, we wanted to tackle the infrastructure of the problem head-on, so we tested many approaches -- streaming formats with ffmpeg and ffserver, RTMP/RTSP, and software stacks that use OpenCV algorithms to interpret the surroundings and give our external hardware some added intelligence. Within the 36 hours, we put together hardware that could take input from 6 different cameras.
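A minimal sketch of how a per-camera ffmpeg-to-ffserver pipeline like the one described above might be wired up on Linux. The device paths, feed URLs, and port are hypothetical placeholders, not the team's actual configuration:

```python
def ffmpeg_stream_cmd(device, feed_url, fps=60):
    """Build an ffmpeg command that captures one V4L2 webcam
    and pushes it to an ffserver feed URL."""
    return [
        "ffmpeg",
        "-f", "v4l2",            # Linux video capture input
        "-framerate", str(fps),  # target the fluid 60fps the headset needs
        "-i", device,            # e.g. /dev/video0
        feed_url,                # ffserver feed endpoint (assumed URL scheme)
    ]

# One streaming command per camera: /dev/video0 .. /dev/video5
cmds = [
    ffmpeg_stream_cmd(f"/dev/video{i}", f"http://localhost:8090/cam{i}.ffm")
    for i in range(6)
]
```

Each command would run as its own process, so a slow or disconnected camera does not stall the other five feeds.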

We see this being useful wherever you want to access a part of the physical world in 3D just by following a web address. Although we did not have enough time to finish the web portion of the project, we made great progress toward virtual reality over IP, so that people can share the presence of being with one another, regardless of time and distance, in an immersive setting. SnackDC brought a bunch of IRC geeks from across the States together to create something awesome. Why not have that all the time, without all the inconveniences of meatspace?
