What It Does
Clarifi AI brings your favorite concerts right to you by overlaying live, real-time lyrics and subtle spatial cues onto your view through Snap Spectacles. It’s a complete AR solution that ensures that even if you’re stuck in the back row, you won’t miss a single beat or word of the performance. In short, it gives you an immersive, augmented experience that transforms how you enjoy live music, making every seat feel like the best seat in the house.
Inspiration
We all know the feeling: sitting in the back of a packed concert, struggling to see the stage and catch the lyrics, while the front row gets all the excitement. We wondered why the best parts of a concert should be reserved for those closest to the action. Inspired by this everyday frustration, we set out to create a tool that not only brings the live performance closer to everyone but also adds smart features like language translation and accessibility options. Whether it’s breaking down language barriers or ensuring clear, readable lyrics for those with hearing or visual challenges, Clarifi AI is all about making the concert experience richer and more inclusive.
How We Built It
We built Clarifi AI by combining the best of hardware and software. Using Snap Spectacles and Lens Studio, we developed a hands-free AR overlay that displays dynamic content. For the live lyrics, we synchronized a pre-recorded concert video with JSON data, ensuring every lyric pops up at just the right moment. Our design process was supercharged by AI-driven tools in Figma, which allowed us to rapidly prototype a minimal, polished UI that’s easy on the eyes. On the web, we put together a simulation using React, Vite, and Tailwind CSS, so you can see the experience in action even outside of the AR environment.
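To illustrate the synchronization step, here is a minimal sketch of how timed lyric data in JSON can drive an overlay: each line carries a start time, and the display shows whichever line the playback clock has reached. The names (`LyricCue`, `activeLyric`) and the sample timings are illustrative assumptions, not the actual Clarifi AI code.

```typescript
// A hypothetical shape for the JSON timing data the video is synced against.
interface LyricCue {
  time: number; // seconds from the start of the video
  text: string;
}

// Example cue list; real data would come from a JSON file.
const cues: LyricCue[] = [
  { time: 0.0, text: "" },
  { time: 12.4, text: "First line of the chorus" },
  { time: 16.8, text: "Second line of the chorus" },
];

// Return the lyric whose start time most recently passed.
// Assumes cues are sorted by ascending time.
function activeLyric(cues: LyricCue[], t: number): string {
  let current = "";
  for (const cue of cues) {
    if (cue.time <= t) {
      current = cue.text; // this cue has started; remember it
    } else {
      break; // all later cues start in the future
    }
  }
  return current;
}

console.log(activeLyric(cues, 13.0)); // "First line of the chorus"
```

Polling a function like this on every frame of the video (or on Lens Studio's update event) keeps the overlay in step with playback without any per-frame markup in the video itself.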
Challenges We Ran Into
Learning to use Snap Spectacles and Lens Studio was a new frontier for us, and getting the real-time lyric synchronization just right under a tight deadline was difficult. On top of that, all of us had packed schedules this weekend, so we were working under an intense time crunch.
What We Learned
Working on Clarifi AI taught us that XR is as challenging as it is exciting. We discovered how powerful rapid iteration can be, especially when you combine cutting-edge design tools with real-world hardware. Integrating different technologies, from AR overlays to live lyric syncing, taught us a lot about user-centered design and the importance of accessibility. Ultimately, we learned that even under extreme time constraints, creativity and collaboration can lead to something truly transformative.
Accomplishments We’re Proud Of
Despite the time crunch, we built a working rough prototype of what Clarifi AI could become!
What’s Next for Clarifi AI: Adaptive AR for Concerts
Looking ahead, we’re excited to take Clarifi AI even further. Our next steps include integrating advanced spatial awareness, possibly through Niantic’s ARDK, to help users navigate crowded venues with ease. We’re also planning to add voice commands and more dynamic adjustments to the live lyric feed, moving to a fully dynamic backend. Ultimately, our vision is to redefine the concert experience, ensuring every fan, no matter where they sit, can feel as though they’re right in the heart of the action.