Inspiration

We've always been fascinated by the invisible forces that shape a room. When you walk into a tense meeting or an energised classroom, you feel it — but you can't quite see it. Humans already subconsciously pick up on emotional cues like tone of voice, facial micro-expressions, body posture, and conversational rhythm. Yet in group settings, these signals are constantly misread or missed entirely.

We asked ourselves: what if you could see the emotional climate of a room?

That question became Auralis — a system designed to give people a new kind of sense. Not to surveil or judge, but to make the invisible, visible.

What it does

Auralis is an Emotional Atmosphere Visualiser that perceives the collective emotional and participation dynamics of a space in real time.

Rather than reading individual emotions, it interprets interaction patterns at the group level, detecting signals such as:

  • Speaking time distribution
  • Speech overlap and interruptions
  • Silence duration
  • Vocal stress patterns
  • Body movement and restlessness

These signals combine to form a room dynamic profile, which is then translated into an ambient, living visualisation displayed in the space. This provides something that can be felt rather than analytically read.
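Since Auralis was prototyped in Figma rather than code, here is only an illustrative sketch of how such a room dynamic profile might be derived. All names (`RoomSignals`, `room_profile`) and the choice of dimensions are our own assumptions for the example; the Gini coefficient is one standard way to summarise how unevenly speaking time is distributed.

```python
from dataclasses import dataclass

@dataclass
class RoomSignals:
    """Aggregate, anonymous signals sampled over a short window (hypothetical)."""
    speaking_share: list[float]  # fraction of the window each (unnamed) voice held
    overlap_ratio: float         # fraction of the window with overlapping speech
    silence_ratio: float         # fraction of the window with no speech
    stress_level: float          # 0..1 aggregate vocal-stress estimate
    movement_level: float        # 0..1 aggregate restlessness estimate

def gini(shares: list[float]) -> float:
    """Gini coefficient of speaking time: 0 = perfectly balanced, higher = dominated."""
    xs = sorted(shares)
    n, total = len(xs), sum(xs)
    if n == 0 or total == 0:
        return 0.0
    cum = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * cum) / (n * total) - (n + 1) / n

def room_profile(s: RoomSignals) -> dict[str, float]:
    """Collapse raw signals into the dimensions a visualisation could read."""
    return {
        "balance": 1.0 - gini(s.speaking_share),        # evenness of participation
        "tension": (s.overlap_ratio + s.stress_level) / 2,
        "stillness": s.silence_ratio,
        "energy": (s.movement_level + s.overlap_ratio) / 2,
    }
```

Note that the profile never identifies individuals: only the distribution of speaking shares enters the calculation, matching the aggregate-only constraint described below.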

Target environments include classrooms, meetings, workshops, therapy rooms, and collaborative workspaces. Our goal is for this product to be implemented anywhere that group dynamics matter.

How we built it

We approached Auralis as an experiential design problem first, and a technical one second. Our process involved:

  1. Research & Concept Development — Mapping the hidden signals of group interaction and identifying which could be realistically detected.
  2. Visualisation Exploration — Exploring metaphor systems to represent atmosphere. Two leading directions emerged:
    • Emotional Weather — calm discussions render as clear skies; tension as storm clouds; confusion as fog; energetic debate as wind or lightning.
    • Social Constellation — participation dynamics appear as a star network, where balanced conversation produces evenly connected nodes, dominant voices become bright central stars, and quieter participants appear as dim, distant points.
  3. Interface Design — Designing an ambient room display that evolves gradually, avoids dominating attention, and resists reducing atmosphere to simple scores.
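To make the Emotional Weather metaphor concrete, here is a minimal sketch of how a profile might map onto weather states. The threshold values and dictionary keys are illustrative placeholders, not tuned parameters from the prototype; in particular, mapping high stillness to fog is one reading among several, since silence is ambiguous (see below).

```python
def weather_state(profile: dict[str, float]) -> str:
    """Map a room dynamic profile onto the Emotional Weather metaphor.

    Keys ('tension', 'energy', 'stillness') and thresholds are assumptions
    made for this example, not values from the actual design.
    """
    if profile["tension"] > 0.6:
        return "storm clouds"        # sustained interruption and vocal stress
    if profile["energy"] > 0.6:
        return "wind and lightning"  # lively, high-energy debate
    if profile["stillness"] > 0.7:
        return "fog"                 # long silence: calm focus or unease
    return "clear skies"             # balanced, calm discussion
```

A real display would blend between states gradually rather than switching on hard thresholds, in keeping with the ambient, slowly evolving interface described above.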

Challenges we ran into

  • Avoiding surveillance optics — Any system that monitors a room risks feeling invasive. We had to design carefully around anonymity: no individual identification, only aggregate group dynamics, no stored recordings.
  • Resisting reductionism — It was tempting to display clean scores or meters. But turning nuanced human dynamics into a number felt wrong. Finding visualisations that feel interpretive rather than analytical was a real design challenge.
  • Signal ambiguity — Silence, for example, could mean calm focus or uncomfortable tension. Designing a system that holds that ambiguity honestly, rather than flattening it, required a lot of iteration.

Accomplishments that we're proud of

  • Developing a design framework that centres accessibility and inclusivity — particularly for neurodivergent individuals, people with social anxiety, and those in unfamiliar cultural environments who may find group social cues hard to read.
  • Framing the problem not as emotion detection, but as atmosphere perception — a subtle but important distinction that kept our design ethical and human-centred.
  • Creating a concept that sits at the intersection of experiential design, ambient technology, and social advocacy for in-person human connection.

What we learned

We learned that designing for social dynamics means designing for ambiguity and nuance. The most interesting and honest systems don't try to give you the answer — they give you a new way of sensing, and trust you to interpret it.

We also deepened our understanding of how much group interaction goes unperceived. The research reinforced that this isn't just a design opportunity — it's genuinely a gap in how people experience shared spaces.

Finally, we learned that the form of a visualisation is an ethical decision. How you show data shapes how people feel about the people that data represents.

What's next for Auralis

  • Explore hardware integration — Ambient lighting and environmental feedback as an alternative or complement to a screen-based display.
  • Develop space-type presets — Calibrate the system's sensitivity and visualisation style for classrooms, therapy rooms, and corporate settings respectively.
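A space-type preset could be as simple as a small configuration object. This sketch is purely hypothetical (no such code exists yet); the field names and values only illustrate the idea that a therapy room would want lower sensitivity and slower visual evolution than a corporate meeting room.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SpacePreset:
    """Hypothetical per-environment calibration; all values illustrative."""
    name: str
    tension_gain: float    # how strongly stress and overlap register
    update_seconds: float  # how gradually the visualisation evolves
    metaphor: str          # which visual language the display uses

PRESETS = {
    "classroom": SpacePreset("classroom", tension_gain=0.8, update_seconds=30, metaphor="weather"),
    "therapy":   SpacePreset("therapy",   tension_gain=0.5, update_seconds=60, metaphor="constellation"),
    "corporate": SpacePreset("corporate", tension_gain=1.0, update_seconds=20, metaphor="weather"),
}
```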

Built With

  • figma