Inspiration

Humans constantly influence each other emotionally, but we rarely notice it happening. Walk into a tense meeting and everyone becomes quieter. Spend time with a joyful friend and your mood lifts. This phenomenon—emotional contagion—shapes our wellbeing, relationships, and productivity, yet it remains largely invisible.

We were inspired by the idea that humans likely possess more senses than the traditional five. One emerging concept is social or emotional sensing: the subtle cues we pick up from body language, tone, and group dynamics.

Resonance imagines a future where this hidden layer of human interaction becomes visible. Instead of guessing how a space feels emotionally, people could perceive and navigate social energy intentionally, improving mental health, collaboration, and empathy.


What it does

Resonance is a speculative “empathic field detector” that tracks and visualizes emotional contagion and group energy dynamics.

Using future ambient sensors embedded in wearables and environments, the system analyzes signals such as micro-expressions, vocal tone shifts, physiological stress markers, and spatial proximity. These signals combine to generate a real-time emotional field map of a space.
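In our prototype this fusion exists only as a Figma mock, but the idea can be illustrated with a small sketch. Everything below is invented for illustration: the channel names, the 0-to-1 ranges, and the weights are placeholders, not calibrated models.

```python
from dataclasses import dataclass

@dataclass
class SignalSample:
    """One normalized reading (0..1) per hypothetical sensor channel."""
    micro_expression_valence: float  # 0 = negative, 1 = positive
    vocal_tone_arousal: float        # 0 = flat, 1 = agitated
    stress_marker: float             # 0 = relaxed, 1 = stressed
    proximity: float                 # 0 = isolated, 1 = close contact

def field_value(sample: SignalSample, weights=(0.4, 0.2, 0.3, 0.1)) -> float:
    """Collapse one person's channels into a single resonance score in [0, 1]."""
    channels = (
        sample.micro_expression_valence,
        1.0 - sample.vocal_tone_arousal,  # calmer tone raises the score
        1.0 - sample.stress_marker,       # lower stress raises the score
        sample.proximity,                 # closeness amplifies contagion
    )
    return sum(w * c for w, c in zip(weights, channels))

def smooth(previous: float, current: float, alpha: float = 0.3) -> float:
    """Exponential smoothing so the field drifts rather than flickers."""
    return (1 - alpha) * previous + alpha * current
```

The smoothing step matters for the "real-time" feel: raw biometric signals are noisy, so the displayed field should follow a slow-moving average rather than every spike.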

The tool allows users to:

  • Perceive group emotional energy that would normally be invisible
  • Track emotional contagion as moods spread through a room
  • Receive interventions that help stabilize or improve social environments

The system translates these signals into a new sensory layer called Resonance, visualized as a dynamic field representing collective emotional states like calm, tension, curiosity, or fatigue.

The goal is to help people understand how environments affect them emotionally—and how they affect others.


How we built it

We designed Resonance as a speculative multi-surface interface rather than a traditional single app.

Our prototype was created in Figma, focusing on interaction design and user experience. The system includes three main components:

1. Personal wearable interface: A lightweight wearable detects biometric signals such as heart rate variability, breathing patterns, and facial micro-expressions.

2. Spatial emotional map: The interface visualizes emotional energy in shared spaces through a live Resonance Field, showing patterns of calm, tension, and engagement across a room.

3. Insight and intervention layer: Users receive contextual insights and subtle interventions, such as breathing prompts, meeting pacing suggestions, or environmental adjustments.
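The intervention layer above is a design concept, but one detail we reasoned about is worth making concrete: prompts should not flicker on and off as the field hovers near a threshold. A hysteresis band handles this. The sketch below is illustrative only; the thresholds and the prompt itself are placeholders.

```python
from typing import Optional

class InterventionGate:
    """Decides when to surface a low-disruption prompt.

    A hysteresis band (trigger > release) means a prompt fires only
    when tension clearly rises, and the gate re-arms only after
    tension clearly falls, so the user isn't pinged repeatedly.
    """
    def __init__(self, trigger: float = 0.7, release: float = 0.5):
        self.trigger = trigger
        self.release = release
        self.active = False

    def update(self, tension: float) -> Optional[str]:
        if not self.active and tension >= self.trigger:
            self.active = True
            return "breathing prompt"  # placeholder intervention
        if self.active and tension < self.release:
            self.active = False  # re-arm once the room has settled
        return None
```

This is the kind of guard that keeps ambient feedback from becoming the distraction it is meant to prevent.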

We focused heavily on data visualization, ambient feedback, and low-disruption interaction design so the system augments human awareness without overwhelming the user.
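For the spatial map, the visualization we mocked in Figma implies a simple aggregation underneath: readings scattered around a room get binned into grid cells and averaged, and the resulting per-cell values drive the field rendering. A minimal sketch, assuming positioned samples with per-person scores (all values here are made up):

```python
from collections import defaultdict

def resonance_grid(samples, cell_size=1.0):
    """Bin (x, y, score) readings into grid cells and average each cell.

    `samples` is an iterable of (x, y, score) tuples standing in for
    the speculative ambient sensors; cell_size is in arbitrary units.
    """
    sums = defaultdict(lambda: [0.0, 0])
    for x, y, score in samples:
        cell = (int(x // cell_size), int(y // cell_size))
        sums[cell][0] += score
        sums[cell][1] += 1
    # Average per cell; empty cells simply don't appear in the map.
    return {cell: total / count for cell, (total, count) in sums.items()}
```

A heatmap layer over the room plan would then color each cell by its averaged score, which is how patterns of calm and tension become visible at a glance.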


Challenges we ran into

One of the biggest challenges was designing a system that introduces a completely new sense without overwhelming users with information.

We had to carefully consider:

  • How emotional data should be visualized without feeling invasive
  • How to surface insights at the right moment
  • How to prevent the tool from becoming distracting during real-world interactions

Another challenge was addressing the ethical implications of emotional sensing. If people can detect the mood of others, it raises questions around privacy, consent, and potential misuse.

Designing safeguards became a core part of the experience rather than an afterthought.


Accomplishments that we're proud of

We’re proud that Resonance goes beyond being just another wellness tracker. Instead of measuring individual metrics like steps or heart rate, it explores an entirely new dimension of human awareness: collective emotional dynamics.

Key accomplishments include:

  • Designing a new sensory interface for emotional fields
  • Creating a visualization system that communicates complex social data intuitively
  • Integrating privacy and consent safeguards into the core experience
  • Building a concept that supports mental, emotional, and social wellbeing

Most importantly, the project reimagines how technology could help people become more empathetic and socially aware, rather than more isolated.


What we learned

This project pushed us to think about technology not just as a tool for efficiency, but as a way to expand human perception.

We learned that designing for a “new sense” requires thinking differently about interaction models. Traditional dashboards and metrics don't work well for emotional experiences. Instead, we had to focus on ambient cues, subtle feedback, and contextual insights.

We also realized how important ethical design becomes when dealing with sensitive human signals like emotions.


What's next for Resonance

If Resonance were developed further, we would explore several directions:

1. Group wellbeing tools: Applications for workplaces, classrooms, and teams to monitor and improve collective emotional health.

2. Mental health support: Helping users recognize environments that increase stress or emotional fatigue.

3. Adaptive environments: Spaces that respond automatically to emotional signals, adjusting lighting, sound, or layout to support wellbeing.

4. Stronger privacy frameworks: Developing consent-based emotional sensing systems where individuals maintain full control over their data.

Ultimately, Resonance imagines a future where emotional awareness becomes a shared capability—helping people navigate social environments with greater empathy, clarity, and care.

Built With

  • figma-make