Inspiration 💡
The primary inspiration for Cosmic Odyssey is the incredible journey of Dr. Ben Carson. From a childhood of extreme disadvantage and failing grades in Detroit, Dr. Carson’s life was completely transformed when he found a spark of inspiration that led him to become a world-renowned neurosurgeon.
I believe that one moment of curiosity can change a child's life trajectory. I didn't just want to build a science app; I wanted to create a "spark generator" for the next generation of pioneers. By making the universe something you can reach out and touch on your Snap Spectacles, we’re proving that no field of science is too distant or difficult to master.
What it does 🚀
Cosmic Odyssey turns your environment into an interactive, 3D classroom.
- Reach & Learn: Walk through an accurately modeled solar system. Simply hold your hand over any planet to surface facts and info, tailored to be bite-sized and kid-friendly.
- Dwell-to-Select Quiz: Challenge your new knowledge with a gamified quiz! To keep the experience seamless, we implemented a Dwell-Selection System: just hover your hand over an answer for one second to "lock it in."
- AR Interaction: Whether it's exploring the rings of Saturn or Earth's Moon, every object reacts to your physical presence.
How we built it
I built Cosmic Odyssey specifically for the Snap Spectacles platform using Lens Studio and TypeScript.
- Interaction Engine: We bypassed the standard, often-jittery "trigger-on-touch" approach and built a custom Dwell-Timer engine using Physics.ColliderComponent.
- UX Strategy: We used a hybrid approach of Camera-Locked UI for quick info pop-ups (ensuring facts are always readable) and World-Static UI for the quiz dashboard (to provide a stable, "whiteboard" feel in 3D space).
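The dwell mechanic can be sketched as a small state machine: hovering gives instant feedback, but the selection only confirms after a full second. This is a minimal, platform-agnostic sketch, not the actual lens code; in Lens Studio the `enter`/`exit` calls would be driven by the collider's overlap events, and the callback names here are illustrative.

```typescript
type DwellCallbacks = {
  onHoverStart: () => void; // e.g. show the "▶" cursor immediately
  onConfirm: () => void;    // fires exactly once when the dwell completes
  onHoverEnd: () => void;   // hide the cursor and reset progress
};

class DwellSelector {
  private elapsed = 0;
  private hovering = false;
  private confirmed = false;

  constructor(
    private readonly dwellSeconds: number,
    private readonly cb: DwellCallbacks,
  ) {}

  // Hand entered the answer's collider.
  enter(): void {
    this.hovering = true;
    this.elapsed = 0;
    this.confirmed = false;
    this.cb.onHoverStart();
  }

  // Hand left the collider before (or after) confirming.
  exit(): void {
    this.hovering = false;
    this.elapsed = 0;
    this.cb.onHoverEnd();
  }

  // Called every frame with that frame's delta time in seconds.
  update(dt: number): void {
    if (!this.hovering || this.confirmed) return;
    this.elapsed += dt;
    if (this.elapsed >= this.dwellSeconds) {
      this.confirmed = true;
      this.cb.onConfirm();
    }
  }
}
```

The key design point from the write-up is the split between feedback and action: `onHoverStart` reacts on the very first frame, while `onConfirm` waits out the noisy hand-tracking jitter.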
Challenges I ran into
The biggest technical hurdle was input precision in AR. Hand tracking can be noisy, and standard triggers often fire accidentally.
To solve this, I pivoted to Dwell-Based Interaction. If a child hovers over an option, the text reacts instantly with a "▶" cursor, but the action only confirms if they hold for a full second. I also initially tried to glue the quiz to the user's view, but quickly realized it felt "wonky." Moving it to a static world-space panel immediately made the experience feel more grounded and comfortable.
Accomplishments that I'm proud of
I'm incredibly proud of the reliability of our interactions. Even without a visible hand model, the user always knows exactly what they are selecting because the world reacts to them in real time. I've also curated a dataset of 50+ kid-friendly space facts, ensuring the project provides real educational value immediately.
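For illustration, the fact dataset could be as simple as a typed list keyed by celestial body; the field names and sample facts below are hypothetical, not the project's actual schema.

```typescript
// Hypothetical shape for the kid-friendly fact dataset.
interface SpaceFact {
  body: string; // which planet or moon the fact belongs to
  text: string; // one bite-sized, kid-friendly sentence
}

const FACTS: SpaceFact[] = [
  { body: "Saturn", text: "Saturn's rings are made mostly of chunks of ice and rock." },
  { body: "Mars", text: "Mars has the tallest volcano in the whole solar system." },
];

// Pick the facts to show when a hand hovers over a given body.
function factsFor(body: string, facts: SpaceFact[]): string[] {
  return facts.filter(f => f.body === body).map(f => f.text);
}
```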
What I learned
I learned that context is king in spatial computing. What works on a flat screen (like a button) doesn't work the same in 3D space when your hand is the controller. Designing for "comfort over speed" led us to rethink how we advance the quiz and how we place our UI elements relative to the user's reach.
What's next for Cosmic Odyssey
- AI Voice Assistance: We have a backend ready to go using Google Gemini and ElevenLabs TTS. The next step is a voice-powered astronaut that you can ask questions in real time.
- More Exhibits: Build new exhibits to teach even more subjects.
- Partnerships: Partner with NGOs to give access to people across North America.
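The voice pipeline described above can be sketched as two request builders: a child's question goes to Gemini for a kid-friendly answer, and the answer text goes to ElevenLabs for speech. This is an assumption-laden sketch based on the public REST APIs of both services; the endpoint paths, prompt, model IDs, and the placeholder voice ID are not taken from the project's actual backend.

```typescript
// Public Gemini REST endpoint (model name is an assumption).
const GEMINI_URL =
  "https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-flash:generateContent";

// Body for Gemini's generateContent call, wrapping the child's question.
function buildGeminiRequest(question: string) {
  return {
    contents: [
      { parts: [{ text: `Answer for a curious kid, in two short sentences: ${question}` }] },
    ],
  };
}

// ElevenLabs text-to-speech request; ASTRONAUT_VOICE_ID is a placeholder.
function buildTtsRequest(answerText: string) {
  return {
    url: "https://api.elevenlabs.io/v1/text-to-speech/ASTRONAUT_VOICE_ID",
    body: { text: answerText, model_id: "eleven_multilingual_v2" },
  };
}
```

In a real deployment the two calls would live behind a single serverless endpoint (the project lists Vercel) so that API keys never reach the Spectacles client.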
Built With
- elevenlabs
- gemini
- spectacles
- typescript
- vercel
