Inspiration
It had been years since some of us had last been together in person, and Hack the North gave us that chance. Old friends reconnected, and through Raed we met Allen, who quickly became part of the group. By the end of the weekend, we were no longer just teammates; we were friends building something meaningful together.
As we caught up, we realized how much of our friendship had always been built around storytelling: sharing memories, retelling old moments, and imagining new ones. Stories are more than just words; they are how we reconnect, how we preserve identity, and how we relive what matters most.
That realization sparked SpectraSphere. We wanted to make stories something you can not only tell but actually step inside. With Snapchat Spectacles, we set out to reimagine how people experience their own memories, ideas, and imagination. Instead of scrolling through flat images, why not live inside them?
What it does
SpectraSphere turns your prompts and images into immersive AR stories that you can step inside using Snapchat Spectacles.
From the user's perspective, it feels simple and magical:
- On our TypeScript web app, you start by writing a story prompt or narrating one with your voice.
- You then upload four images that capture the essence of your story.
- With a single click, those inputs are sent directly to your Spectacles.
- Inside Spectacles, your photos appear around you in a rotatable 3D environment.
- You can instantly switch between creative animation styles such as Ghibli, cyberpunk, comic, and more, each reframing your story through a different "spectra."
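A payload along these lines could carry the inputs above from the web app to Spectacles; the field names and validation rules here are our own illustration, not the exact schema:

```typescript
// Hypothetical shape of the story payload the web app sends downstream.
// Field names are illustrative; the real schema may differ.
interface StoryPayload {
  prompt: string;                           // typed or transcribed story prompt
  images: string[];                         // the four uploaded images (URLs or base64)
  style: "ghibli" | "cyberpunk" | "comic";  // initial animation style
}

// Package and validate the inputs before they are sent to Spectacles.
function buildPayload(
  prompt: string,
  images: string[],
  style: StoryPayload["style"]
): StoryPayload {
  if (prompt.trim().length === 0) {
    throw new Error("A story prompt is required");
  }
  if (images.length !== 4) {
    throw new Error("SpectraSphere expects exactly four images");
  }
  return { prompt, images, style };
}
```

Validating up front like this keeps malformed requests from ever reaching the AR side.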
Behind the scenes, SpectraSphere works by combining multiple technologies:
- Cohere API expands and contextualizes the story prompt so the AR experience feels richer than raw text.
- Gemini API manages image generation and styling to adapt uploaded photos to different artistic modes.
- Lens Studio SDK and Spectacles API bring the story into AR, rendering the 3D environment and enabling smooth style transitions.
The result is a pipeline that moves seamlessly from prompt to photos to immersive AR storytelling.
Right now, SpectraSphere is designed for photos. Looking ahead, we envision extending this to generative video, evolving narratives, and interactive branching stories where memories come alive as moving, dynamic experiences.
How we built it
We approached SpectraSphere as a full pipeline, from frontend interaction to AR rendering, with careful attention to performance on Snap Spectacles.
Frontend, backend, and APIs
We built a TypeScript web app as the entry point for storytelling. It handles both text and voice input, manages image uploads, and packages everything into a structured JSON format. The frontend connects directly to Cohere and Gemini through API calls, and performs validation before sending data downstream. We also designed a lightweight Node.js backend to coordinate between APIs and our AR pipeline:
- Cohere is called first to enrich prompts, producing narrative-friendly text.
- Gemini is then used for media handling, applying transformations and style presets to the uploaded images.
- The backend includes custom error handling, request batching, and asset optimization to keep responses fast enough for real-time use inside Spectacles.
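The Cohere-then-Gemini ordering above can be sketched as a small orchestration function. The two API calls are stubbed with placeholders here; the real backend goes through the Cohere and Gemini HTTP APIs:

```typescript
// Sketch of the backend coordination step: enrich the prompt first, then
// style the images. Both calls below are stand-ins, not real client code.
type StyledImage = { src: string; style: string };

// Stand-in for the Cohere call that expands a raw prompt into narrative text.
async function enrichPrompt(prompt: string): Promise<string> {
  return `Once upon a time: ${prompt}`; // placeholder enrichment
}

// Stand-in for the Gemini call that applies a style preset to one image.
async function styleImage(src: string, style: string): Promise<StyledImage> {
  return { src, style };
}

// Enrich the prompt, then batch the per-image requests so they run
// concurrently; any failure rejects the whole pipeline with one error.
async function runPipeline(prompt: string, images: string[], style: string) {
  const narrative = await enrichPrompt(prompt);
  const styled = await Promise.all(images.map((src) => styleImage(src, style)));
  return { narrative, styled };
}
```

Batching the image calls with `Promise.all` is one way to keep round trips short enough for near-real-time use.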
AR environment
Inside Lens Studio, we built a system for displaying static assets as immersive 3D environments. This includes:
- A panel management system that positions and rotates user images in space.
- A style-switching module that applies multiple artistic filters on demand.
We tuned these features for stable performance on Spectacles hardware, with optimizations like power-of-two texture scaling and lightweight shader effects.
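The power-of-two texture scaling and panel placement mentioned above reduce to two small pieces of math, sketched here independent of any Lens Studio specifics:

```typescript
// Snap a texture dimension up to the next power of two, which GPUs
// generally sample most efficiently.
function nextPowerOfTwo(n: number): number {
  let p = 1;
  while (p < n) p *= 2;
  return p;
}

// Place N image panels evenly on a circle of the given radius around the
// viewer, returning (x, z) offsets on the horizontal plane.
function panelPositions(count: number, radius: number): { x: number; z: number }[] {
  return Array.from({ length: count }, (_, i) => {
    const angle = (2 * Math.PI * i) / count;
    return { x: radius * Math.cos(angle), z: radius * Math.sin(angle) };
  });
}
```

With four images, the panels land at 90° intervals, which is what makes the environment feel like it surrounds you.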
Testing and validation
Because hardware time was limited, we built a local Node.js testing suite to simulate the pipeline. This allowed us to:
- Validate API integration with real Cohere and Gemini calls.
- Benchmark asset loading, memory use, and frame rate.
- Confirm Lens Studio compatibility before deployment.
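A simplified version of the idea: check asset constraints offline, before any Spectacles deployment. The limits and names here are illustrative, not Snap's actual requirements:

```typescript
// Validate mocked assets against the texture constraints Spectacles cares
// about, so problems surface locally rather than on hardware.
type MockAsset = { name: string; width: number; height: number };

const MAX_TEXTURE_DIM = 2048; // assumed budget for this sketch

function isPowerOfTwo(n: number): boolean {
  return n > 0 && (n & (n - 1)) === 0;
}

// Return a list of human-readable errors; an empty list means all assets pass.
function validateAssets(assets: MockAsset[]): string[] {
  const errors: string[] = [];
  for (const a of assets) {
    if (a.width > MAX_TEXTURE_DIM || a.height > MAX_TEXTURE_DIM) {
      errors.push(`${a.name}: exceeds ${MAX_TEXTURE_DIM}px`);
    }
    if (!isPowerOfTwo(a.width) || !isPowerOfTwo(a.height)) {
      errors.push(`${a.name}: dimensions are not powers of two`);
    }
  }
  return errors;
}
```

Running checks like this on every mocked pipeline run let us catch bad assets without burning scarce hardware time.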
Challenges we ran into
Stepping into AR for the first time
None of us had built for AR or VR before Hack the North. Only Allen had prior Unity experience, and that became our bridge into Lens Studio. The rest of us had to pick things up from scratch: how components work, the strict inheritance rules in TypeScript files, and how Lens Studio expects scripts to interact with 3D objects. The first hours were frustrating, filled with broken builds and cryptic errors, but gradually we adapted and started thinking in terms of AR instead of web.
Gemini integration
Making the Gemini API key compatible with Snap Spectacles was another major hurdle. Unlike a standard web app, Spectacles access APIs through the Remote Service Gateway, which has strict rules about authentication and security. What should have been "one line of code" turned into an entire debugging session inside the Spectacles environment.
Limited hardware access
Our time with physical Spectacles was limited, which meant most of our development happened blind. To work around this, we built a local testing setup that mimicked the pipeline as closely as possible, complete with mocked API responses and asset loading checks. While this kept progress moving, it also forced us to make educated guesses about real-world performance until late in the build.
Hackathon constraints
We also hit the universal hackathon hurdles: sleep-deprived coding sessions, APIs that did not always behave as expected, and tough tradeoffs. For example, we wanted to experiment more deeply with generative video, but Gemini's rate limits and our time window meant focusing on photo panels and text-to-speech instead of video generation.
Accomplishments that we're proud of
Bridging web and AR environments
One of our proudest achievements was successfully connecting a modern web stack with Snap Spectacles. Moving data from a TypeScript application into Lens Studio required us to reconcile two very different development paradigms. We built a consistent pipeline that packages user prompts and images in a format Spectacles could reliably interpret. This integration made it possible for SpectraSphere to feel seamless to the user, despite the complexity happening under the hood.
Overcoming Gemini integration barriers
Getting Gemini to work within the constraints of the Spectacles environment was another milestone. Through the Remote Service Gateway, we had to solve authentication issues, handle asynchronous responses, and adapt outputs for rendering in Lens Studio. By the end of the hackathon, we had a functioning system where Gemini-generated content flowed directly into AR, a result that required both persistence and careful engineering.
Rapid mastery of Lens Studio
Most of the team had never touched AR before, yet in under two days we went from confusion to competence with Lens Studio. We learned how components, inheritance rules, and TypeScript scripts fit together in a 3D space. By the final demo, we had a working AR environment that positioned panels correctly and allowed creative style switching. The speed at which we adapted to these tools is something we take pride in.
What we learned
Planning before executing
We discovered quickly that jumping straight into development without mapping out the user flow created unnecessary setbacks. Early attempts to link the frontend, APIs, and Lens Studio led to broken states and confusing errors. Taking a step back to plan the pipeline, from prompt to images to AR rendering, proved essential. This reinforced how important architecture and data flow design are, even under hackathon pressure.
Simplifying the stack
It was tempting to keep adding new technologies to cover gaps, but we learned that simplicity is often more powerful. Instead of experimenting with extra services, we focused on making Cohere, Gemini, and Lens Studio work reliably together. Keeping the stack lean allowed us to spend more time polishing the experience and less time wrestling with unnecessary complexity.
Rapidly learning new tools
For most of the team, Lens Studio was completely unfamiliar. We had to quickly adapt to its component system, TypeScript scripting rules, and the way it handles 3D environments. This showed us that even without prior AR experience, strong programming fundamentals and a willingness to learn can get you from zero to a working demo in a short time.
What's next for SpectraSphere
Seamless integration with Snapchat Memories
One of our first priorities is to connect SpectraSphere directly to Snapchat Memories. Instead of manually uploading images, users could select past experiences already saved in their accounts, and instantly relive them in AR. This would allow stories to be created and experienced with minimal friction, making SpectraSphere feel like a natural extension of everyday life.
Expanded creative toolset
We plan to broaden the style library far beyond the initial Ghibli, cyberpunk, and comic filters. Future iterations will allow users to layer effects, mix artistic aesthetics, and even train personal styles unique to their own memories. By giving users more control over how their stories look and feel, SpectraSphere becomes not only a storytelling platform but also a creative studio.
Interactive and collaborative narratives
Storytelling becomes even more powerful when it is shared. We envision interactive narratives where users can make choices that branch the story in real time, or invite friends to co-create an experience together. Imagine building a collective trip diary, where each person's photos and memories merge into a single immersive AR environment.
Applications beyond personal storytelling
While SpectraSphere began with memories, the same platform can support education, travel, and entertainment. Teachers could create explorable history lessons, travellers could share experiences as immersive journals, and artists could experiment with interactive AR exhibitions. By positioning SpectraSphere as a flexible framework, we open possibilities across many domains.
Built With
- cohere
- gemini
- google-cloud
- json
- lens-studio-sdk
- postman
- remote-service-gateway
- snap-spectacles
- snapchat
- typescript



