Inspiration
Sensory overwhelm is something many people quietly navigate every day, from neurodivergent individuals managing overstimulation to anyone trying to stay grounded in chaotic environments. We were inspired by research on cross‑modal perception, which shows that the brain doesn’t process senses in isolation. Sound can influence how we see colour, motion, and even emotional tone.
We wondered:
What if technology could gently rebalance those sensory channels in real time?
What if AR could feel less like a screen and more like an aurora that adapts to you?
What it does
Sensora is a new way to move through the world with clarity and comfort. It’s a mobile companion app paired with AR smart glasses that gently reshape how you experience your environment. Using cross‑modal perception, Sensora listens to the sounds around you and adjusts the colours you see in real time, softening overstimulation or enhancing awareness when you need it most. The app guides you through setup, calibration, and personalized sensory modes, creating an experience that feels intuitive, accessible, and beautifully attuned to you. Sensora brings the calm glow of an aurora into everyday life, helping you navigate the world with balance and ease.
How we built it
We designed the entire experience in Figma Design, focusing on calm visuals, accessible flows, and an aurora‑inspired aesthetic. Using Figma Make, we prototyped real‑time interactions and sensory transitions to validate the emotional tone early. Technically, we built an audio analysis pipeline, a colour‑modulation engine, and Bluetooth communication with AR glasses — all grounded in neuroscience research on multisensory integration.
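The core of the pipeline above is the step from "what the microphone hears" to "what colour the glasses show." As a rough sketch of that idea (the function names, hue ranges, and thresholds here are our illustrative assumptions, not the shipped engine), a frame of audio can be reduced to a loudness value and mapped to a calmer colour as the environment gets louder:

```javascript
// Illustrative sketch of a loudness-to-colour mapping.
// Assumes audio frames arrive as arrays of samples in [-1, 1];
// all names and constants are hypothetical.

function rms(samples) {
  // Root-mean-square loudness of one audio frame.
  let sum = 0;
  for (const s of samples) sum += s * s;
  return Math.sqrt(sum / samples.length);
}

function loudnessToColour(loudness) {
  // Clamp to [0, 1], then ease louder environments toward cooler,
  // dimmer hues to soften overstimulation.
  const level = Math.min(Math.max(loudness, 0), 1);
  const hue = 200 + 40 * level;       // blue drifting toward violet as sound rises
  const lightness = 70 - 30 * level;  // dimmer when the room is loud
  return `hsl(${Math.round(hue)}, 60%, ${Math.round(lightness)}%)`;
}
```

In a real build this value would feed the Bluetooth link to the glasses each frame; the sketch only shows the mapping itself.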
Challenges we ran into
Designing a UI that responds to real‑world sensory input was far more complex than we expected. Sound is unpredictable: it spikes, fades, and shifts in ways that don’t always map cleanly to visual comfort. Translating those fluctuations into colour changes without overwhelming the user required careful tuning. We also had to consider how differently people perceive colour and sensory intensity, which meant building calibration tools that felt supportive rather than technical. Balancing responsiveness with emotional safety became one of our biggest design challenges, especially when creating an interface that adapts in real time without ever feeling chaotic or intrusive.
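One concrete shape that tuning took was smoothing: raw loudness spikes cannot drive the display directly, or the colours flicker. A common technique (sketched here with illustrative coefficients, not our exact values) is an asymmetric exponential filter, where the level rises at one rate when sound gets louder and falls more slowly when it fades:

```javascript
// Illustrative attack/release smoother for a loudness signal in [0, 1].
// Coefficients are hypothetical; larger values react faster.

function makeSmoother(attack = 0.2, release = 0.05) {
  let level = 0;
  return function smooth(input) {
    // Faster coefficient when sound rises (attack), slower when it
    // fades (release), so the colour never jumps with every spike.
    const k = input > level ? attack : release;
    level += k * (input - level);
    return level;
  };
}
```

Feeding each frame's loudness through a smoother like this before mapping it to colour keeps the display responsive to real changes while filtering out momentary spikes.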
Accomplishments that we're proud of
We’re proudest that we completed this project in a single weekend despite our busy schedules! We’re also proud of creating a cross‑modal AR experience that feels gentle, intuitive, and emotionally supportive. Our Figma prototypes helped us craft an interface that users described as calming, and our sensory model is grounded in real neuroscience. Most of all, we built something that helps people feel more at ease in their everyday environments.
What we learned
We learned that designing for the senses requires a completely different mindset: one rooted in empathy, patience, and deep listening. Working with real‑world sensory input taught us how unpredictable and personal perception can be, and how small visual shifts can dramatically change someone’s comfort. We also discovered the power of prototyping early in Figma: it allowed us to test emotional tone, accessibility, and sensory flow long before development. Most importantly, we learned that meaningful technology isn’t just functional; it supports people in the moments when the world feels overwhelming.
What's next for Sensora
Next, we plan to add adaptive learning, colour‑blind accessibility modes, and haptic feedback.
Built With
- figma
- html5
- javascript
- p5.js