The First Time
Inspiration

Modern wellness technologies mostly measure what is easy to quantify: steps, heart rate, sleep cycles, or calorie intake. While these metrics provide useful biological data, they overlook a fundamental layer of human experience: our sensory relationship with the environment.

Our inspiration came from the structure of human skin, the body's largest sensory organ. Skin continuously processes temperature, pressure, vibration, moisture, and chemical signals through layered biological systems. Yet we rarely perceive how these signals shape our emotional responses, attention, and behavior. We began asking a simple question: What if we could visualize sensory perception itself?

"The First Time" explores the idea that touch and sensory perception could be mapped, observed, and manipulated like other quantified-self data. The project imagines an interface where the user can interact with a living biological surface, almost like observing their own skin from the inside.

The concept was inspired by:

- biological diagrams of dermal layers
- haptic feedback systems
- medical HUD interfaces
- interactive scientific visualizations

The result is a speculative tool that transforms human sensory experience into an interactive perceptual map.
What it does

The First Time is an interactive prototype that simulates a living membrane interface, representing human skin as a responsive sensory surface. The prototype allows users to explore how different environmental stimuli affect tactile perception. The interface works at two scales.
Macro View — The Membrane Plane

At the surface level, users interact with a biological slab representing skin tissue. The membrane behaves elastically and responds to user interaction:

- Dragging stretches the membrane
- Pressing compresses the surface
- Environmental variables alter its visual and physical properties

The interface simulates sensory inputs such as heat, humidity, oil, and ticklishness. These parameters dynamically affect the membrane's color, texture, and elasticity.
Micro View — Cellular Perspective

Users can zoom into the membrane to observe a cellular-level visualization. This view reveals:

- oscillating cells representing biological activity
- subtle motion mimicking blood flow
- blurred glassmorphism layers representing dermal depth

The transition between views simulates zooming into a pore of the skin, creating a sense of traveling through biological layers.
Perceptual Mapping

The system analyzes interactions and generates a Perceptual Map Summary that interprets the sensory state. Instead of raw numbers, the prototype produces contextual sensory profiles such as:

- The Delicate Guardian
- The Anchor
- The Comfort Seeker
- The Curious Explorer

These profiles translate sensory interactions into psychological and situational interpretations, connecting digital touch to real-world emotional contexts.
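The mapping from interaction data to a profile can be sketched as a simple rule-based classifier. The profile names come from the prototype, but the input parameters and thresholds below are illustrative assumptions, not the actual Figma logic:

```python
# Illustrative sketch: map simulated sensory parameters (each normalized to
# 0.0-1.0) to one of the prototype's perceptual profiles.
# The thresholds are hypothetical.

def perceptual_profile(heat: float, humidity: float, oil: float,
                       ticklishness: float) -> str:
    """Return a contextual sensory profile from normalized stimulus levels."""
    if ticklishness > 0.7:                 # highly reactive surface
        return "The Delicate Guardian"
    if heat < 0.3 and humidity < 0.3:      # low stimulation overall
        return "The Anchor"
    if oil > 0.6 or humidity > 0.6:        # seeks soothing conditions
        return "The Comfort Seeker"
    return "The Curious Explorer"          # mixed, exploratory input

print(perceptual_profile(heat=0.2, humidity=0.1, oil=0.2, ticklishness=0.1))
# prints "The Anchor"
```

In the prototype itself this logic lives in Figma interaction states rather than code; the sketch only shows the shape of the interpretation step.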
How we built it

The prototype was designed and built entirely inside Figma using interactive components, variables, and Smart Animate logic. The core system uses a modular component architecture.

Living Membrane Component

The membrane was constructed using layered visual elements:

- a soft organic base layer
- a subtle mesh grid representing biological fibers
- inner shadows and highlights to simulate depth
Elastic deformation was simulated using Smart Animate transitions between vector states. When the user presses the membrane, the system applies a vertical scale transformation of

scale_y = 0.70 to 0.95

depending on the simulated body region, creating the illusion of biological displacement.
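The press compression can be expressed as a linear interpolation between the two scale extremes, driven by a per-region sensitivity value. Only the 0.70-0.95 range comes from the prototype; the region names and sensitivity numbers below are hypothetical:

```python
# Sketch of the press deformation: scale_y ranges from 0.70 (most compression)
# to 0.95 (least), interpolated by a hypothetical per-region sensitivity.

REGION_SENSITIVITY = {   # 1.0 = most sensitive, deepest compression (assumed)
    "fingertip": 1.0,
    "palm": 0.6,
    "forearm": 0.3,
}

def press_scale_y(region: str) -> float:
    """Map a body region to the vertical scale applied on press."""
    s = REGION_SENSITIVITY[region]
    # sensitivity 1.0 -> scale 0.70, sensitivity 0.0 -> scale 0.95
    return round(0.95 - 0.25 * s, 3)

print(press_scale_y("fingertip"))  # 0.7
print(press_scale_y("forearm"))    # 0.875
```

In Figma this interpolation is baked into per-region component variants rather than computed at runtime.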
Dynamic Environment Variables

Environmental parameters were implemented using Figma variables and interaction states. Each variable alters the membrane's visual system:

| Parameter | Visual Effect |
|---|---|
| Heat | shifts color toward warm reds |
| Dryness | adds cracked texture overlays |
| Oil | increases specular highlights |
| Humidity | introduces translucent droplets |
| Ticklishness | applies subtle vibration |
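As a sketch of how one such variable can drive a visual property, the heat-to-red shift can be modeled as a linear blend between a base skin tone and a warm red. The RGB values are illustrative assumptions; in the prototype, Figma variable bindings handle this rather than code:

```python
# Sketch: blend a base membrane color toward warm red as "heat" rises.
# Colors are illustrative RGB tuples; heat is normalized to 0.0-1.0.

BASE_COLOR = (224, 188, 170)   # neutral skin tone (assumed)
HEAT_COLOR = (214, 69, 48)     # warm red target (assumed)

def membrane_color(heat: float) -> tuple:
    """Linearly interpolate each RGB channel by the heat level."""
    heat = max(0.0, min(1.0, heat))  # clamp to the valid range
    return tuple(
        round(base + (hot - base) * heat)
        for base, hot in zip(BASE_COLOR, HEAT_COLOR)
    )

print(membrane_color(0.0))  # (224, 188, 170)
print(membrane_color(1.0))  # (214, 69, 48)
```

The other parameters follow the same pattern, each binding a normalized variable to one visual channel (texture opacity, highlight strength, droplet density, or vibration amplitude).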
Multi-View Interaction

The system includes two interaction layers:

- Macro View – deformable membrane
- Micro View – animated cellular environment

Transitions between views use zoom-based navigation, creating the sensation of traveling deeper into biological tissue.
Medical HUD Interface

The final interface was structured into a three-column medical dashboard layout:

- Left Panel - anatomy console with body map and physics readouts
- Center Panel - primary interaction zone containing the living membrane
- Right Panel - environmental controls, sensory telemetry, and export functions

The interface uses a dark medical HUD aesthetic with cyan accents to emphasize biological data visualization.
Challenges we ran into

One of the biggest challenges was simulating organic biological behavior inside a tool primarily designed for UI design. Figma does not natively support physics systems, so we had to approximate elasticity using Smart Animate transitions and carefully tuned scaling behaviors.

Another challenge was maintaining clarity in a highly information-dense interface. Early versions of the prototype contained overlapping panels and excessive visual noise. We solved this by restructuring the interface into a modular grid system and introducing consistent typography and spacing rules.

We also encountered issues with animation states where undefined values caused interaction errors. These were resolved by explicitly defining starting values for opacity and scale during animation sequences.
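The fix for the undefined-value errors amounts to a defaulting pattern: every animation state starts from explicit values so a transition always has a defined origin. A minimal sketch of that pattern, with property names that are illustrative rather than Figma's actual API:

```python
# Sketch of the fix: never leave animation properties undefined. Any state
# missing a value falls back to an explicit default, so Smart Animate-style
# transitions always have a defined starting point.

DEFAULT_STATE = {"opacity": 1.0, "scale": 1.0}

def resolve_state(partial: dict) -> dict:
    """Fill missing animation properties with explicit defaults."""
    return {**DEFAULT_STATE, **partial}

print(resolve_state({"scale": 0.85}))
# {'opacity': 1.0, 'scale': 0.85}
```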
Accomplishments that we're proud of

One of our proudest achievements is creating the illusion of biological material behavior using purely interface tools. The membrane convincingly feels elastic and reactive despite being built entirely with vector shapes and Smart Animate.

We are also proud of the Perceptual Map system, which transforms raw sensory inputs into meaningful interpretations rather than simple numerical data.

Finally, the Medical HUD interface successfully presents complex biological interactions in a clean and readable format while keeping the membrane as the visual focal point.
What we learned

Through this project we learned that sensory experience can be treated as an information system. Human perception is not just passive reception: it is a dynamic negotiation between environment, body, and cognition.

Designing this prototype also showed us the power of interaction design as a storytelling medium. Even simple animations can convey biological processes and make abstract sensory concepts tangible.

Technically, we also gained experience in:

- advanced Figma prototyping
- variable-driven interface design
- interaction-based storytelling
- designing scientific interfaces for non-experts
What's next for The First Time

The current prototype explores a speculative visualization of touch and sensory perception. Future iterations could expand the system in several directions:

- Haptic Integration – connecting the interface to physical haptic devices to simulate actual tactile feedback
- Biometric Data Input – integrating real environmental data such as temperature, humidity, and skin conductivity
- Expanded Sensory Models – exploring additional senses such as sound, smell, and proprioception
- Machine Learning Analysis – using interaction patterns to generate personalized sensory profiles

Ultimately, we envision The First Time as a tool that helps people understand their own sensory relationship with the world, turning invisible biological signals into something visible, interactive, and meaningful.
Built With
- aftereffects
- deepseek
- figjam
- figmamake
- figmaslides
- gemini