Inspiration & Goal
Driven by a new interest in game development, we created Eat Up! as our first exploration into AR. We wanted to turn a face filter into a responsive "micro-game" where dietary choices have literal visual consequences, rendered through Lens Studio's Face Stretch feature.
How we built it
The lens was developed in Lens Studio v5.17.2 using JavaScript.

Interaction: Real-time mouth tracking triggers the "eating" mechanic when food prefabs collide with the head binding.

Logic: We used a weight-based system to drive face deformation.

Feedback: The UI updates dynamically based on the formula \(Health = 1.0 - Weight\). This value controls both the health bar's width and the color-shifting status text.
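A minimal sketch of that weight-based loop in plain JavaScript (the names `eatFood`, `getHealth`, and the 0–1 weight range are our illustration, not the project's actual script):

```javascript
// Illustrative weight-based system: each "eaten" food adds to a
// weight value clamped to [0, 1]; health is derived from it.
let weight = 0.0;

// Called when a food prefab collides with the head binding.
function eatFood(foodValue) {
  weight = Math.min(1.0, weight + foodValue);
}

// Health = 1.0 - Weight drives both the bar width and status color.
function getHealth() {
  return 1.0 - weight;
}

// Simple green-to-red shift as health drops (RGB channels in [0, 1]).
function healthColor() {
  const h = getHealth();
  return { r: 1.0 - h, g: h, b: 0.0 };
}

eatFood(0.25);
eatFood(0.25);
console.log(getHealth());    // 0.5
console.log(healthColor());  // { r: 0.5, g: 0.5, b: 0 }
```

The same health value can then be written to both the bar's scale and the text's color each frame, which keeps the two pieces of feedback in sync by construction.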
Challenges & Learning
As beginners, we navigated several technical hurdles that deepened our understanding of the platform:

The Shared Material Trap: We initially struggled with UI elements "syncing" colors because they shared one material asset. We learned to duplicate assets and use `.getMaterial()` to target specific instances.

UI Architecture: Understanding the Orthographic Camera and Screen Transforms was key. Fixing the Pivot settings was a breakthrough, allowing the health bar to shrink from the side rather than the center.

API Specifics: Debugging argument-count errors taught us the importance of the Lens Studio Scripting API's strict requirements, even for single-material objects.
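The shared-material trap can be shown in plain JavaScript. This is a conceptual model only, not the Lens Studio API: `Material`, `clone`, and the two UI objects are stand-ins for the engine's material assets and visuals.

```javascript
// Conceptual model of the "shared material trap": two UI elements
// holding a reference to the SAME material object both change color
// when either one is tinted.
class Material {
  constructor(color) { this.color = color; }
  clone() { return new Material({ ...this.color }); }
}

const shared = new Material({ r: 1, g: 1, b: 1 });

// Both elements point at the same asset...
const healthBar = { material: shared };
const statusText = { material: shared };

healthBar.material.color = { r: 1, g: 0, b: 0 };
// ...so the status text turns red too:
console.log(statusText.material.color.r); // 1

// Fix: give each visual its own material instance before tinting it.
statusText.material = shared.clone();
statusText.material.color = { r: 0, g: 1, b: 0 };
console.log(healthBar.material.color.g);  // still 0
```

In Lens Studio the same idea applies at the asset level: each visual needs its own material instance (for example, a duplicated material asset) before its color is changed independently.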
Quick Specs
Platform: Snap AR (Snapchat)
Tools: Lens Studio v5.17.2, JavaScript
APIs: Face Tracking (Mouth Detection), Face Stretch Visual, Screen Transform UI
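The pivot fix mentioned above comes down to simple anchor geometry; this plain-JavaScript sketch (our own illustration, not the Screen Transform API) shows why a left-edge pivot makes the bar shrink from one side:

```javascript
// When a bar is scaled, its pivot point stays fixed. With a centered
// pivot the bar shrinks from both sides; with a left-edge pivot it
// shrinks from the right only.
// Returns the bar's [left, right] extent for a given scale factor.
function barExtent(pivotX, left, right, scale) {
  return [
    pivotX + (left - pivotX) * scale,
    pivotX + (right - pivotX) * scale,
  ];
}

// A bar spanning x = 0..10 at half health (scale 0.5):
console.log(barExtent(5, 0, 10, 0.5)); // centered pivot -> [ 2.5, 7.5 ]
console.log(barExtent(0, 0, 10, 0.5)); // left-edge pivot -> [ 0, 5 ]
```

With the pivot moved to the bar's edge, writing the health value straight into the horizontal scale produces the expected one-sided shrink.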
