Four-lens Lomos. Analog-film stereoscopic photos. Old-school 3D glasses.

How it works

A group of people (up to 7) open the Stereogram app on their iPhones (Android app pending) and point their cameras at the same subject from multiple perspectives. When any one of them takes a picture, every phone fires its shutter at the same time. All the images are uploaded to the server, where stereoscopic GIFs (and optionally a 3D anaglyph) are generated and tagged using the Imagga API, so users can search and filter them.
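The stereoscopic "wobble" effect comes from playing the per-phone frames in a loop. A minimal sketch of that GIF assembly using Pillow (function and parameter names are illustrative, not the actual server code):

```python
from PIL import Image

def make_wobble_gif(frame_paths, out_path, frame_ms=150):
    """Assemble one frame per phone into a looping 'wobble' GIF.

    Frames are played forward then backward (ping-pong) so the
    loop is smooth instead of snapping back to the first frame.
    """
    frames = [Image.open(p).convert("RGB") for p in frame_paths]
    sequence = frames + frames[-2:0:-1]  # e.g. A B C -> A B C B
    sequence[0].save(
        out_path,
        save_all=True,
        append_images=sequence[1:],
        duration=frame_ms,  # milliseconds per frame
        loop=0,             # loop forever
    )
```

The ping-pong ordering is a common trick for wobble GIFs; the real pipeline may simply loop the frames in capture order.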

The server also provides a JSON API and a basic web client so users can list, search, and view the generated images.
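The API shape isn't documented here; a hypothetical listing response (all field names invented for illustration) might look like:

```json
{
  "images": [
    {
      "id": "abc123",
      "gif_url": "https://example.com/images/abc123.gif",
      "anaglyph_url": "https://example.com/images/abc123-anaglyph.jpg",
      "tags": ["person", "outdoors"]
    }
  ]
}
```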

Challenges we ran into

Syncing the phones. We tried a number of different approaches to trigger all the shutters at once:

  • producing a sound at a specific frequency and listening for it on all the devices (didn't work well in noisy environments);
  • syncing the clocks using NTP and scheduling a time to take the picture (laggy);
  • sending a callback from the server (even more laggy);
  • finally settled on the iOS Multipeer Connectivity framework, which connects devices over Bluetooth 4.0 or Wi-Fi and offers an "unreliable" send mode, akin to UDP, that fires and forgets.

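Multipeer Connectivity's unreliable mode behaves like plain UDP: the triggering phone fires the shutter message at every peer and doesn't wait for acknowledgements, which keeps latency minimal. A sketch of that fire-and-forget pattern over raw UDP sockets in Python (illustrative only; the app itself uses the Swift framework):

```python
import socket

TRIGGER = b"SHUTTER"

def make_listener(port=0):
    """A UDP socket playing the role of a peer waiting for the trigger."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", port))
    return sock

def fire_trigger(peers):
    """Fire-and-forget: send the shutter message to every peer.

    No acknowledgement, no retry -- a lost datagram means a missed
    frame, but the frames that do arrive are tightly synchronized.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for host, port in peers:
        sock.sendto(TRIGGER, (host, port))
    sock.close()
```

The trade-off is the same one the unreliable Multipeer mode makes: dropping retransmission logic removes the queuing delays that made the reliable approaches laggy.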
Generating the images:

  • The GIF generation was pretty simple, but the anaglyphs took a bit more time to nail.
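A red-green anaglyph puts the left eye's view in the red channel and the right eye's view in the green channel, so each lens of the glasses filters out the other eye's image. A minimal NumPy sketch of that channel mixing (an assumption about the approach, not the actual server code):

```python
import numpy as np

def red_green_anaglyph(left_rgb, right_rgb):
    """Combine two RGB frames (H x W x 3 uint8) into a red-green anaglyph.

    The red filter passes the left frame's luminance, the green filter
    passes the right frame's; the blue channel is left empty.
    """
    luma = np.array([0.299, 0.587, 0.114])  # standard luma weights
    left_gray = left_rgb @ luma
    right_gray = right_rgb @ luma
    out = np.zeros(left_rgb.shape, dtype=np.uint8)
    out[..., 0] = np.clip(np.rint(left_gray), 0, 255).astype(np.uint8)   # red <- left
    out[..., 1] = np.clip(np.rint(right_gray), 0, 255).astype(np.uint8)  # green <- right
    return out
```

Getting a convincing effect also depends on how well the two source frames are aligned, which is likely where the extra tuning time went.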

Accomplishments that I'm proud of

We're proud of actually managing to build the whole system and getting some great results with it.

Some of our favorite stereoscopic images:

Some of our favorite 3D anaglyphs (you need red-green 3D glasses to see the effect):

What I learned

How to sync multiple iPhones and make them talk to each other almost instantaneously. How 3D anaglyphs work. People love moving pictures more than stills!

What's next for Stereogram

Improve the iOS app's usability. Tweak the web client. Build an Android app. Get people using it!
