Inspiration
We wanted to turn a familiar chase fantasy into a fast, reflex-driven AR experience. The idea of escaping Voldemort in a magical corridor felt perfect for a tension-heavy, tap-based Lens.

What it does
Save Harry is an endless-runner-style Snapchat Lens in which players tap to flip Harry between the floor and ceiling, dodging incoming obstacles while cursed heads taunt them and the speed ramps up with every successful escape.
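
The score-and-speed loop described above could be sketched as follows. This is a minimal illustration, not the Lens's actual code; the names (`GameState`, `onSuccessfulEscape`) and the tuning constants are hypothetical.

```typescript
// Hypothetical sketch: each successful dodge bumps the score and
// scales obstacle speed linearly toward a cap so the game stays playable.
const BASE_SPEED = 2.0;  // obstacle speed at score 0 (illustrative units/s)
const SPEED_STEP = 0.15; // speed added per successful escape
const MAX_SPEED = 6.0;   // hard cap on obstacle speed

interface GameState {
  score: number;
  obstacleSpeed: number;
}

function onSuccessfulEscape(state: GameState): GameState {
  const score = state.score + 1;
  const obstacleSpeed = Math.min(BASE_SPEED + score * SPEED_STEP, MAX_SPEED);
  return { score, obstacleSpeed };
}

let state: GameState = { score: 0, obstacleSpeed: BASE_SPEED };
state = onSuccessfulEscape(state); // score 1, speed nudged toward the cap
```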

How we built it
We built the Lens entirely in Lens Studio, using scripted animations instead of tweens, physics-free obstacle movement, quaternion-based flips, and layered micro-animations for instability and menace. Audio, scoring, and difficulty scaling are all handled in real time.
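
"Scripted animations instead of tweens" can be pictured as an update callback that advances a normalized time and eases it by hand. The sketch below is a standalone illustration with assumed easing and timing values, not the Lens's code; in Lens Studio the sampling would happen inside an `UpdateEvent`.

```typescript
// Tween-free scripted animation: ease a normalized time t manually
// instead of relying on a tween library.
function easeInOutQuad(t: number): number {
  return t < 0.5 ? 2 * t * t : 1 - Math.pow(-2 * t + 2, 2) / 2;
}

// Value between `from` and `to` after `elapsed` seconds of a `duration`-second animation.
function animate(from: number, to: number, duration: number, elapsed: number): number {
  const t = Math.min(Math.max(elapsed / duration, 0), 1); // clamp to [0, 1]
  return from + (to - from) * easeInOutQuad(t);
}

// Sampled per frame in an update loop; here we just evaluate one instant.
animate(0, 1, 0.4, 0.2); // halfway through a 0.4 s animation → 0.5
```

Driving every value this way keeps all motion under explicit script control, which is what makes the micro-animation layering possible.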

Challenges we ran into
Synchronizing rotations during flips, avoiding gimbal lock, and keeping animations stable without tweens took careful quaternion math. We also had to work around Lens Studio quirks in text visibility and component states.
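
The gimbal-free flip comes down to interpolating between orientations as quaternions rather than Euler angles. Lens Studio exposes a `quat` type with slerp built in; the standalone sketch below (an assumed reconstruction, not the Lens's code) shows the underlying math for the 180° floor-to-ceiling flip.

```typescript
// Minimal quaternion sketch for a 180° flip.
type Quat = { w: number; x: number; y: number; z: number };

// Rotation of `angle` radians about a unit axis (ax, ay, az).
function fromAxisAngle(ax: number, ay: number, az: number, angle: number): Quat {
  const half = angle / 2;
  const s = Math.sin(half);
  return { w: Math.cos(half), x: ax * s, y: ay * s, z: az * s };
}

// Spherical linear interpolation: constant angular velocity, no gimbal lock.
function slerp(a: Quat, b: Quat, t: number): Quat {
  let dot = a.w * b.w + a.x * b.x + a.y * b.y + a.z * b.z;
  if (dot < 0) { // take the shorter arc
    b = { w: -b.w, x: -b.x, y: -b.y, z: -b.z };
    dot = -dot;
  }
  if (dot > 0.9995) { // nearly identical: normalized lerp is stable enough
    const q = { w: a.w + t * (b.w - a.w), x: a.x + t * (b.x - a.x),
                y: a.y + t * (b.y - a.y), z: a.z + t * (b.z - a.z) };
    const n = Math.hypot(q.w, q.x, q.y, q.z);
    return { w: q.w / n, x: q.x / n, y: q.y / n, z: q.z / n };
  }
  const theta = Math.acos(dot);
  const sa = Math.sin((1 - t) * theta) / Math.sin(theta);
  const sb = Math.sin(t * theta) / Math.sin(theta);
  return { w: sa * a.w + sb * b.w, x: sa * a.x + sb * b.x,
           y: sa * a.y + sb * b.y, z: sa * a.z + sb * b.z };
}

const upright = fromAxisAngle(0, 0, 1, 0);       // Harry on the floor
const flipped = fromAxisAngle(0, 0, 1, Math.PI); // Harry on the ceiling
const halfway = slerp(upright, flipped, 0.5);    // 90° mid-flip, no gimbal issues
```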

Accomplishments that we’re proud of
We replaced all tweens with clean scripting, achieved smooth flip mechanics, created character instability and taunting enemy motion, and delivered a polished, game-like experience fully inside a Snapchat Lens.

What we learned
We learned how powerful procedural animation can be in AR, how to manage complex state safely in Lens Studio, and how small motion details dramatically improve player immersion.

What’s next for Save Harry
We plan to add multiple levels, visual spell effects, adaptive difficulty, and leaderboard integration to turn Save Harry into a replayable competitive AR game.
