Inspiration
Overcoming emotional blindness: we noticed in our own lives that some activities genuinely reduce stress, while others merely distract us with short-term pleasure. This observation inspired us to design a system that helps people become aware of both their stress and pleasure patterns, enabling more conscious emotional choices. Our goal was technology that gives immediate sensory cues, helping users recognize and calm their stress in real time, like a “futuristic guardian” guiding them before things escalate.
What it does
Auralyze is an MR-powered wellness tool that translates biological signals like brain waves into real-time visual cues through mixed-reality glasses. By making emotional states visible, it helps users recognize stress early, understand pleasure variations, and receive timely guidance to maintain emotional balance.
Auralyze enhances interoception, the ability to sense internal states such as stress or emotional tension. Using sensors and MR visualization, it converts biological signals into real-time cues, helping users perceive emotional shifts that normally occur outside conscious awareness.
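The sensing hardware itself is proposed rather than built, but the core signal-to-cue idea can be sketched. Assuming a wearable already supplies a normalized stress score between 0 (calm) and 1 (high stress), a minimal mapping to an RGB color cue for the MR display might look like this; `stress_to_color` is a hypothetical helper for illustration, not part of the actual project:

```python
def stress_to_color(stress: float) -> tuple[int, int, int]:
    """Map a normalized stress score (0 = calm, 1 = high stress) to an RGB cue.

    Calm maps to green, high stress to red, with a linear blend between.
    A real system would also smooth the score over time to avoid flicker.
    """
    s = min(max(stress, 0.0), 1.0)  # clamp noisy sensor values to [0, 1]
    red = round(255 * s)
    green = round(255 * (1.0 - s))
    return (red, green, 0)


# Example: a calm reading renders as pure green, a stressed one as pure red.
print(stress_to_color(0.0))  # (0, 255, 0)
print(stress_to_color(1.0))  # (255, 0, 0)
```

In a full pipeline, this mapping would sit after emotion-interpretation models and feed the MR renderer, so the cue changes gradually as the user's state shifts.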
How we built it
The project followed a five-step design process. AI tools such as Figma Make, ChatGPT, and Gemini supported ideation, while Figma was used for UI design and Photoshop and Illustrator for the visual assets, all built with passion and compassion for mental well-being.
Challenges we ran into
One of the main challenges was conceptualizing how something as complex and subjective as human emotion could be represented in a meaningful way. Since the project proposes a system rather than building the sensing technology itself, much of the effort went into research and brainstorming how stress and pleasure could be visualized through MR. Another challenge was exploring new AI design tools like Figma Make; learning to prompt effectively and generate the desired interface outputs required experimentation.
Accomplishments that we're proud of
Our accomplishments include exploring AI-driven design tools like Figma Make, strengthening our UI prototyping skills in Figma, and developing a complete MR interaction concept supported by refined visual assets.
What we learned
Through this project, we learned to explore AI-assisted design tools like Figma Make and develop effective prompting to generate interface concepts. We also expanded our understanding of human sensory systems and how technology can interpret internal signals. Most importantly, the project pushed us to think beyond traditional mobile and web interfaces and design experiences for mixed reality environments.
What's next for AURALYZE
The next step for Auralyze is to further validate the concept through user testing and research. This would involve refining the MR interface, exploring partnerships with wearable sensing technologies, and developing more accurate emotional interpretation models. Future work would also focus on improving the interaction design and evaluating how real-time emotional feedback can support healthier behavioral patterns in everyday environments.
Ultimately, the goal is to evolve Auralyze into a seamless everyday companion that helps users maintain emotional awareness, reduce accumulated stress, and build healthier emotional regulation habits over time.
Built With
- adobe-illustrator
- figma
- figmake
- figmaslides
- photoshop