Inspiration

Spark-a-trope is an augmented reality reproduction of an animated zoetrope, built around the groundbreaking motion photography of Eadweard Muybridge. The zoetrope, invented in the 1830s by the British mathematician William George Horner, is an optical toy in which figures are made to revolve on the inside of a cylinder and are viewed through slits in its circumference. The viewer sees a rapid succession of images, producing the illusion of motion. Muybridge is known for his pioneering work on animal locomotion in 1877 and 1878, which used multiple cameras to capture motion in stop-motion photographs. These techniques seem rather quaint by today's standards of immersive CGI, motion tracking, and display hardware, but at the time they were state of the art and offered a small glimpse into a wider world.

What it does

The Instagram World AR Effect puts the viewer inside the drum of a zoetrope. When it starts, the viewer is surrounded by 12 static frames of an animation loop composed of stop-motion animal images from Eadweard Muybridge's work.

SWIPE UP: Swap in a different animal (Cat, Dog, Pig, Monkey).

SWIPE RIGHT: Start the images spinning.

SCREEN TAP: Start the virtual strobe light blinking, which "captures" the spinning images at the correct frame rate (see the sketch after this list).

LONG SCREEN PRESS: Stop the images spinning.
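The "correct frame rate" for the strobe follows from simple zoetrope arithmetic: with N frames on the drum spinning at R revolutions per second, N × R frames pass the viewer each second, so the strobe must flash N × R times per second to freeze each frame once. A minimal plain-JavaScript sketch (the helper name and numbers are illustrative, not taken from the project):

```javascript
// Strobe frequency needed to "freeze" a spinning zoetrope drum.
// strobeHz is a hypothetical helper; the values below are examples.
function strobeHz(frameCount, revsPerSecond) {
  // frames per revolution * revolutions per second = flashes per second
  return frameCount * revsPerSecond;
}

console.log(strobeHz(12, 1)); // 12 flashes/sec -> a 12 fps animation loop
```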

How I built it

Initially, having developed AR with Unity, I was preparing to do some JavaScript development, but it turned out I was able to get all of the app's touch interaction and animation functionality working using just the Patch Editor. Not one line of JavaScript was written to make this World AR Effect. The tutorials were helpful in getting me started with the Patch Editor, and I was able to improvise from there. This visual scripting interface has many possibilities for rapid prototyping of AR activations.
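For comparison, here is a rough sketch of what the same interactions might look like in Spark AR's scripting layer, which this project deliberately avoided. TouchGestures and Diagnostics are real Spark AR modules; the pan thresholds and the use of onPan to approximate swipes are my assumptions, not the project's actual wiring:

```javascript
const TouchGestures = require('TouchGestures');
const Diagnostics = require('Diagnostics');

// Tap starts the virtual strobe; long press stops the spin.
TouchGestures.onTap().subscribe(() => Diagnostics.log('start strobe'));
TouchGestures.onLongPress().subscribe(() => Diagnostics.log('stop spin'));

// There is no dedicated swipe event, so one assumed workaround is to watch
// each pan gesture's translation and treat a large move as a swipe.
// Screen y grows downward, so a swipe up drives translation.y negative.
TouchGestures.onPan().subscribe((gesture) => {
  gesture.translation.y.lt(-50).onOn().subscribe(() =>
    Diagnostics.log('swipe up: swap animal'));
  gesture.translation.x.gt(50).onOn().subscribe(() =>
    Diagnostics.log('swipe right: start spin'));
});
```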

Challenges I ran into

Trying to create a traditional gameplay loop in the Patch Editor was difficult, since I wasn't able to figure out how to get and set global variables; every approach I tried created a disallowed logic loop. I had to punt on that, and instead got the different functions to work together by using different Pulse triggers to reset each process. The swipe gestures were not well documented, and it feels like there is probably a better solution than the one I worked out (a hedged alternative is sketched below).
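One alternative I did not pursue would be to keep the "global" state in a script and bridge it into the patch graph. Patches, Reactive, and TouchGestures are real Spark AR modules, but the input names and the gesture-to-action mapping below are hypothetical illustrations, not this project's setup:

```javascript
const Patches = require('Patches');
const Reactive = require('Reactive');
const TouchGestures = require('TouchGestures');

// Script-held "global" variable that the patch graph reads as a scalar.
// 'animalIndex' and 'resetSpin' are hypothetical patch input names that
// would have to be declared in the project's script properties.
let animalIndex = 0;

// Forward long presses into the graph as a pulse that resets the spin.
Patches.inputs.setPulse('resetSpin', TouchGestures.onLongPress());

// Cycle the animal on each gesture (a tap here, for brevity) and push
// the new index into the patch graph.
TouchGestures.onTap().subscribe(() => {
  animalIndex = (animalIndex + 1) % 4; // Cat, Dog, Pig, Monkey
  Patches.inputs.setScalar('animalIndex', Reactive.val(animalIndex));
});
```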

Accomplishments that I'm proud of

This is my first Instagram AR filter and my first time using the Patch Editor in Spark AR. Getting it working fairly close to what I had in mind, and seeing an augmented reality virtual zoetrope animate with nearly 150-year-old image assets from the very first attempts at motion capture, is rather impressive.

What I learned

To develop Instagram AR filters you need to embrace the limitations of the current state of the technology: the 4 MB file-size limit, the available mobile touch interactions, the menu of Patch Editor modules, the amount of time given to work on the project, and so on. The solutions developed when faced with such boundaries can be quite interesting and clever (the budget arithmetic below shows how tight that 4 MB limit really is).
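For a sense of scale, here is a back-of-envelope budget, assuming this project's asset counts (4 animals × 12 frames) take up most of the 4 MB; the split is my illustration, not a measurement:

```javascript
// Rough per-frame budget under Instagram's 4 MB filter cap.
const budgetBytes = 4 * 1024 * 1024; // 4 MB limit
const animals = 4;                   // Cat, Dog, Pig, Monkey
const framesPerAnimal = 12;          // one zoetrope loop each

const perFrameBytes = budgetBytes / (animals * framesPerAnimal);
console.log(Math.floor(perFrameBytes / 1024) + ' KB per frame'); // ~85 KB
```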

What's next

I'd like to develop this virtual zoetrope further and release it as a Facebook filter, where the 10 MB file-size limit leaves room for more animated animals and people. Perhaps develop a part two as another Instagram filter (Spark-a-trope 2?).

Note: Eadweard Muybridge's "Animal Locomotion: An Electro-Photographic Investigation of Consecutive Phases of Animal Movements" was photographed and published in 1887, and the photographic images within have since lapsed into the public domain. https://www.loc.gov/item/2001704020/ https://picryl.com/media/animal-locomotion-38

Built With

  • patch-editor
  • photopea
  • sparkar