Inspiration

WiggleGram was inspired by classic lenticular and early internet “wiggle” 3D photography — where slight shifts in perspective create a powerful illusion of depth. We wanted to recreate that dynamic effect without bulky multi-rig setups or heavy post-production. The goal was simple: capture depth in a single moment using synchronized hardware.

How We Built It

Our system uses four horizontally spaced cameras mounted along a fixed baseline. By capturing images simultaneously from slightly different viewpoints, we generate parallax — the relative shift in object position as viewpoint changes.
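As a rough illustration of that geometry, the pinhole-camera relation d = f·B/Z gives the expected horizontal pixel shift between adjacent viewpoints for a point at depth Z. The function name and the numbers below are our own illustrative assumptions, not measured values from the build:

```python
def disparity_px(baseline_m: float, depth_m: float, focal_px: float) -> float:
    """Expected horizontal pixel shift between two adjacent cameras.

    Pinhole model: d = f * B / Z, where
      B = baseline between the two cameras (meters),
      Z = distance to the scene point (meters),
      f = focal length expressed in pixels.
    """
    return focal_px * baseline_m / depth_m

# Example: 5 cm baseline, subject 2 m away, 1000 px focal length
# gives a 25 px shift between neighboring frames.
print(disparity_px(0.05, 2.0, 1000))
```

Note how disparity falls off with depth: the same baseline produces half the shift for a subject twice as far away, which is why baseline spacing matters so much for perceived depth.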

The four images are stitched into a looping animation that cycles between perspectives, producing the signature “wiggle” effect.
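A minimal sketch of that assembly step, assuming four already-captured frames and using Pillow for GIF output. The helper names, frame timing, and "boomerang" ordering below are illustrative assumptions, not our exact pipeline:

```python
from io import BytesIO
from PIL import Image

def wiggle_sequence(n_frames: int) -> list[int]:
    """Boomerang frame order, e.g. 4 frames -> [0, 1, 2, 3, 2, 1]."""
    forward = list(range(n_frames))
    return forward + forward[-2:0:-1]

def make_wiggle_gif(frames: list[Image.Image], duration_ms: int = 100) -> bytes:
    """Cycle the frames back and forth and loop forever."""
    order = wiggle_sequence(len(frames))
    seq = [frames[i] for i in order]
    buf = BytesIO()
    seq[0].save(buf, format="GIF", save_all=True,
                append_images=seq[1:], duration=duration_ms, loop=0)
    return buf.getvalue()
```

Cycling 0→1→2→3→2→1 instead of restarting at 0 avoids the jarring jump from the last viewpoint back to the first, which is what gives the wiggle its smooth back-and-forth feel.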

What We Learned

  • The importance of precise synchronization — even millisecond offsets caused visible artifacts.
  • Mechanical alignment is critical; small angular errors introduced vertical disparity.
  • Baseline spacing dramatically affects depth perception and comfort.
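A software-only sketch of simultaneous triggering using a threading barrier, so all capture callbacks are released in the same instant. In practice a hardware trigger line gives far tighter sync than this; the function below is a hypothetical illustration, not our production capture code:

```python
import threading

def synchronized_capture(capture_fns):
    """Fire every capture callback as close to simultaneously as possible.

    A Barrier blocks each worker thread until all of them have arrived,
    then releases them together, so the per-camera trigger skew is limited
    to thread scheduling jitter rather than sequential call overhead.
    """
    barrier = threading.Barrier(len(capture_fns))
    results = [None] * len(capture_fns)

    def worker(i, fn):
        barrier.wait()          # all threads block here, then release together
        results[i] = fn()

    threads = [threading.Thread(target=worker, args=(i, fn))
               for i, fn in enumerate(capture_fns)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```

Even with the barrier, OS scheduling can introduce milliseconds of skew between threads — which is exactly the class of offset that produced visible artifacts for us, and why dedicated trigger hardware is worth the effort.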

Challenges

  • Achieving true simultaneous capture across four modules.
  • Calibrating intrinsic and extrinsic camera parameters.
  • Minimizing distortion and stitching artifacts in the final animation.
  • Packaging four sensors into a compact, rigid enclosure without flex.

Outcome

WiggleGram demonstrates that compelling 3D-style imagery can be achieved with thoughtful hardware architecture and geometric principles — turning static photos into immersive motion experiences.

Built With

  • 3d-printing
  • cad
  • camera
  • onshape
  • python
  • raspberry-pi