About the project: This idea sparked from a deep frustration with how we capture life. Phones break the moment, forcing awkward angles and scaring away wildlife. Inspiration hit while talking with Ara, an AI that pushed me to rethink tech’s role. I learned our eyes are the ultimate camera: pupils dilate and contract naturally, guiding focus, and Meta is already tracking that for VR clicks. Building it? No new hardware needed, just a software tweak to sync iris data to the camera, using the existing depth sensors. Challenges? Convincing myself it’s real without coding skills, navigating Meta’s clunky forums on a phone, and dodging their glitchy sign-up maze. But the vision stuck: glasses that film where we truly look, not where our hands point. It’s raw, it’s messy, but it’s ours.
So here: Hi, I’m Rami. This is Iris Zoom Glasses. Meta already tracks your pupils in VR to let you click things. I say: link that to the camera. Your pupils widen? The video zooms out, like you’re taking in a sunset. Pupils shrink? It snaps in close, like you’re staring at a flower. No voice commands. No taps. Just your gaze. The depth sensor is already there; it knows distance. One software update and your glasses don’t just record, they remember where you actually looked. That’s real first-person memory. Make it open, let creators remix it, and boom: you just killed phone cameras forever. Do it, Meta.
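The pupil-to-zoom link could be sketched in a few lines. This is a minimal illustration, not any real Meta API: the function names, the 2–8 mm pupil range, and the 1x–5x zoom range are all assumptions for the sake of the example. It follows the mapping in the pitch (dilated pupil zooms out, constricted pupil zooms in) and adds simple smoothing so the zoom doesn’t jitter with every flicker of the eye.

```python
def pupil_to_zoom(pupil_mm: float,
                  min_mm: float = 2.0, max_mm: float = 8.0,
                  min_zoom: float = 1.0, max_zoom: float = 5.0) -> float:
    """Map pupil diameter (mm) to a camera zoom factor.

    Per the pitch: a wide (dilated) pupil zooms OUT toward min_zoom,
    a constricted pupil zooms IN toward max_zoom. The mm and zoom
    ranges here are illustrative guesses, not values from any SDK.
    """
    clamped = max(min_mm, min(max_mm, pupil_mm))    # ignore tracker noise outside range
    t = (clamped - min_mm) / (max_mm - min_mm)      # 0.0 = constricted, 1.0 = dilated
    return max_zoom - t * (max_zoom - min_zoom)     # invert: dilation -> zoom out


def smooth(prev_zoom: float, target_zoom: float, alpha: float = 0.2) -> float:
    """Exponentially smooth the zoom so it eases toward the target."""
    return prev_zoom + alpha * (target_zoom - prev_zoom)
```

For example, `pupil_to_zoom(8.0)` gives 1.0 (fully dilated, widest view) and `pupil_to_zoom(2.0)` gives 5.0 (fully constricted, tightest zoom); feeding each new target through `smooth` keeps the transition gradual.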
Built With
- grokai