AURA — Making the Invisible Atmosphere Visible
Inspiration
Modern environments constantly influence how people feel and behave, yet most of these signals are invisible. Noise levels, lighting, crowd density, and emotional tension all shape our mood, focus, and sense of comfort.
Despite this influence, we rarely have tools that help us understand these atmospheric conditions in real time.
The inspiration behind AURA came from imagining a world where emotional and environmental signals around us could be visualized and interpreted through technology.
What if your phone could scan a space and reveal the invisible atmosphere around you?
AURA explores this idea by transforming environmental signals into a visual experience.
What it does
AURA is a conceptual mobile interface that scans environments and visualizes the atmosphere of a space.
Using a camera-based interface, the system analyzes environmental signals and presents them through an interactive visual layer.
The app reveals:
- Atmosphere intensity — whether a space feels calm, tense, or energetic
- Environmental signals — lighting, sound levels, and spatial density
- Emotional tone indicators — aura waves surrounding people or environments
- Focus vs distraction levels — how suitable the environment is for work or relaxation
These signals are translated into real-time visual feedback, helping users better understand the spaces they enter.
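AURA is a design concept rather than a working sensor system, but the mapping above can be sketched in code. The following Python sketch is purely illustrative: the signal names, weights, and thresholds are assumptions we chose for the prototype's visual language, not measured values.

```python
from dataclasses import dataclass

@dataclass
class Signals:
    """Normalized environmental readings, each in [0.0, 1.0]."""
    noise: float    # 0 = silent, 1 = very loud
    light: float    # 0 = dark, 1 = harsh/bright
    density: float  # 0 = empty, 1 = crowded

def atmosphere_intensity(s: Signals) -> float:
    """Blend the signals into one intensity score.

    The weights are illustrative assumptions: noise and crowding are
    treated as the main drivers of how "energetic" a space feels.
    """
    return 0.5 * s.noise + 0.2 * s.light + 0.3 * s.density

def atmosphere_label(score: float) -> str:
    """Map the intensity score onto the calm / energetic / tense scale."""
    if score < 0.35:
        return "calm"
    if score < 0.7:
        return "energetic"
    return "tense"

quiet_cafe = Signals(noise=0.3, light=0.4, density=0.2)
print(atmosphere_label(atmosphere_intensity(quiet_cafe)))  # calm
```

In a real product these weights would come from user research or learned models; here they only demonstrate how raw readings could collapse into the single scale the interface visualizes.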
Example uses
Users could:
- Scan a café to see if it’s a good place to work
- Evaluate how calm or tense a meeting room feels
- Identify environments that support focus and wellbeing
AURA turns environmental sensing into a clear visual interface for everyday awareness.
How we built it
The project was developed as a mobile UX prototype exploring speculative environmental sensing.
1. Concept Development
We began by defining the idea of an “atmosphere interface” — a system that translates invisible environmental signals into visual feedback.
2. UX Architecture
The experience was structured around three main screens:
- Live Atmosphere Scanner
- Environmental Analysis Dashboard
- Historical Environment Insights
These screens guide the user from:
Real-time sensing → Environmental interpretation → Personal insight
3. Visual Language
To represent invisible signals, we developed a design language built around:
- soft glowing aura waves
- environmental overlays
- subtle gradients and particle motion
These visual cues communicate atmospheric data without overwhelming the user.
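One way to make this design language concrete is to derive the aura's visual parameters directly from an intensity score. This sketch is an assumption, not the prototype's actual implementation: the endpoint colors and parameter ranges were chosen only to illustrate the calm-to-tense blend.

```python
def lerp(a: float, b: float, t: float) -> float:
    """Linear interpolation between a and b for t in [0, 1]."""
    return a + (b - a) * t

def aura_style(intensity: float) -> dict:
    """Derive aura-wave visual parameters from an intensity score.

    Blends from a cool "calm" tone toward a warm "tense" tone and
    widens the glow as intensity rises; all values are illustrative.
    """
    calm = (80, 160, 255)   # soft blue
    tense = (255, 90, 70)   # warm red
    r, g, b = (round(lerp(c, t, intensity)) for c, t in zip(calm, tense))
    return {
        "color": f"#{r:02x}{g:02x}{b:02x}",
        "glow_radius": lerp(12.0, 48.0, intensity),  # px
        "pulse_rate": lerp(0.5, 2.0, intensity),     # Hz
    }

print(aura_style(0.0)["color"])  # #50a0ff
```

Driving every visual cue from one score keeps the overlay legible: the user reads a single gradient of calm-to-tense rather than several independent indicators.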
4. Interface Prototyping
The interface was prototyped in Figma, simulating how a camera-based sensing experience could function on a smartphone.
Challenges we ran into
One of the biggest challenges was designing an interface for a technology that does not yet fully exist.
Since emotional and atmospheric sensing is still largely speculative, we needed to carefully decide:
- how invisible signals should be visualized
- how much data should appear on screen
- how to balance scientific credibility with a futuristic interface
Another challenge was creating a system that feels intuitive rather than technical.
Users should understand the atmosphere of a space instantly without needing to interpret complex data.
Accomplishments that we're proud of
We are most proud of creating a coherent design vision for a new type of sensing interface.
Key achievements
- Designing a clear UX flow for environmental scanning
- Creating a visual system that communicates invisible signals
- Developing a speculative yet believable product concept
- Building a working interface prototype
AURA demonstrates how design can make complex environmental data understandable through visual storytelling.
What we learned
This project reinforced an important lesson:
Design often begins by imagining possibilities before the technology fully exists.
Through this process we learned how to:
- translate abstract ideas into interface experiences
- design visual metaphors for invisible systems
- simplify complex environmental data for everyday users
The experience showed how UX design can connect technology, perception, and human behavior.
What's next for AURA
Future development of AURA could expand the system into a fully functional sensing platform.
Possible next steps include:
- Integrating real environmental sensors (sound, light, air quality)
- Using AI models to detect emotional patterns in environments
- Developing augmented reality overlays for live atmosphere visualization
- Creating community data sharing to map emotional environments across cities
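The community-mapping step could start as simply as averaging shared readings per place. The sketch below assumes a minimal (place, score) data shape that we invented for illustration; real shared data would need locations, timestamps, and privacy safeguards.

```python
from collections import defaultdict
from statistics import mean

def atmosphere_map(readings):
    """Aggregate shared (place, score) readings into per-place averages.

    `readings` is a list of (place_name, intensity_score) pairs that
    community members might upload. The data shape is a hypothetical
    placeholder, not a defined AURA format.
    """
    by_place = defaultdict(list)
    for place, score in readings:
        by_place[place].append(score)
    return {place: round(mean(scores), 2) for place, scores in by_place.items()}

shared = [
    ("Central Library", 0.2),
    ("Central Library", 0.3),
    ("Main Square", 0.8),
]
print(atmosphere_map(shared))  # {'Central Library': 0.25, 'Main Square': 0.8}
```

Even this naive aggregation would let the app render a city-level layer of calmer and more intense places on top of individual live scans.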
Ultimately, AURA could evolve into a new way for people to understand and navigate the emotional landscapes of everyday spaces.
Built With
- figma
