Inspiration
Human interactions often fail not because people lack kindness, but because they lack awareness in the moment. We frequently hear the phrase _“be kind, you never know what someone is going through.”_ Yet in daily life, emotional signals like stress, anxiety, or pressure remain invisible.
This inspired us to explore a speculative design question: What if emotional signals could become perceivable?
At the same time, many people experience intense social pressure when presenting ideas in front of others. Fear of judgment often prevents people from speaking confidently.
EmpathyLens was created to address both challenges by introducing a new sensory layer that helps people notice emotional signals around them and reduce the fear of being observed.
What it does
EmpathyLens is a speculative wearable AR lens that translates subtle emotional signals into visual cues.
The system introduces two modes:
Empathy Viewer
Detects emotional pressure through subtle signals such as posture, facial tension, and vocal patterns. These signals are visualized through color-coded halos:
Green — calm
Yellow — moderate stress
Red — high emotional pressure
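As a minimal illustration, the color coding above can be thought of as a threshold function over an estimated stress level. The 0–1 scale and the 0.4/0.7 cutoffs below are illustrative assumptions, not values from the prototype (which is simulated in Figma):

```python
def halo_color(stress_score: float) -> str:
    """Map a normalized stress estimate (0.0-1.0) to a halo color.

    The score scale and the 0.4/0.7 thresholds are hypothetical,
    chosen only to illustrate the three-band mapping.
    """
    if stress_score < 0.4:
        return "green"   # calm
    if stress_score < 0.7:
        return "yellow"  # moderate stress
    return "red"         # high emotional pressure

print(halo_color(0.55))  # prints "yellow"
```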
The lens also provides gentle prompts such as:
“Pause before reacting.”
These prompts encourage empathetic responses during everyday interactions.
Confidence Mode
In situations like presentations or meetings, the lens reduces social pressure by transforming intimidating audience faces into playful avatars (such as emojis or vegetables).
This helps users focus on what truly matters — their voice and ideas.
The companion app provides:
- Interaction timelines
- Emotional heat maps
- Empathy moments
- Growth tracking through achievements and an empathy garden
How we built it
EmpathyLens was developed as a speculative interaction prototype using Figma and Figma Make.
We designed:
- A wearable AR lens concept that visualizes emotional signals through subtle halo indicators.
- A mobile companion app interface built in Figma, allowing users to connect the lens, switch between modes, and view insights.
- Interactive flows using Figma Make, enabling us to simulate onboarding, mode activation, and the empathy insights dashboard.
The interface focuses on minimal, meaningful cues such as color-coded halos and gentle prompts, ensuring users receive helpful emotional awareness without information overload.
Through these prototypes, we demonstrated how augmented emotional perception could integrate into everyday social interactions.
Challenges we ran into
Designing emotional awareness technology raised important questions.
One major challenge was ensuring the system guides users without misinterpreting emotions. Emotional signals are complex, so the lens presents cues as suggestions rather than definitive judgments.
Another challenge was avoiding information overload. Too many emotional signals could become distracting. We addressed this by designing minimal AR cues and adjustable prompt frequency.
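The adjustable prompt frequency described above could be sketched as a simple rate limiter that suppresses prompts arriving too soon after the last one. The `PromptThrottle` class and its `min_interval` setting are hypothetical names for illustration, assuming a user-adjustable minimum gap between prompts:

```python
import time

class PromptThrottle:
    """Rate-limit empathy prompts so AR cues stay minimal.

    `min_interval` (seconds between prompts) stands in for a
    hypothetical user-adjustable setting; the actual prototype
    is a Figma simulation, not running code.
    """

    def __init__(self, min_interval: float = 60.0):
        self.min_interval = min_interval
        self._last_prompt = float("-inf")  # no prompt shown yet

    def should_prompt(self, now: float = None) -> bool:
        """Return True (and record the time) if enough time has passed."""
        if now is None:
            now = time.monotonic()
        if now - self._last_prompt >= self.min_interval:
            self._last_prompt = now
            return True
        return False

throttle = PromptThrottle(min_interval=60.0)
print(throttle.should_prompt(now=0.0))   # True: first prompt allowed
print(throttle.should_prompt(now=30.0))  # False: within the interval
print(throttle.should_prompt(now=90.0))  # True: interval elapsed
```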
Balancing speculative technology with believable user experiences was also an important design challenge.
Accomplishments that we're proud of
We are proud that EmpathyLens explores emotional perception in a human-centered way.
Instead of focusing purely on technology, the design emphasizes:
- Empathy in everyday interactions
- Reducing social anxiety
- Encouraging supportive communication
The project demonstrates how a speculative sensory tool could improve both personal confidence and interpersonal understanding.
What we learned
Through this project, we learned how important interaction design and ethical considerations are when introducing new forms of perception.
We explored questions such as:
- How much information is helpful versus overwhelming?
- How can technology encourage empathy without replacing human judgment?
- What safeguards are needed when interpreting emotional signals?
Designing EmpathyLens helped us understand the balance between technology, perception, and responsible design.
Built With
- Figma
- Figma Make