Inspiration
We started with a feeling most of us know: the sense that time is slipping away. A year goes by, and there is nothing to make sense of what you did or the progress you made; you were there, but not really present. That led us to chronoception, the human sense of time. The blind spot is that we have no natural way to perceive the texture or density of our own experiences. Luma is our attempt to make that invisible sense visible for the first time.
What it does
Luma is a speculative wellness tool that turns your lived experience into a visual galaxy unique to you: a map of your treasured moments as they accumulate over time. By passively synthesizing biometric data from a wearable system (heart rate elevation, neurochemical spikes, novel movement patterns) with location, photo metadata, and calendar signals, Luma transforms meaningful moments into stars that gather into clusters, forming a galaxy. The galaxy becomes a mirror that reflects how it feels to be alive and to experience the moments that make life worth living, not just a chronological series of files.
Luma steers away from manual logging. Everything is inferred passively from your body and context. Stars represent single meaningful experiences and are each scored by biometric intensity. Star systems automatically name themselves from the dominant thread of shared experiences (a location, a person, a feeling pattern). The galaxy view shows you the aerial perspective on your own life, something previously impossible to see.
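To make the passive scoring idea concrete, here is a minimal sketch of how a moment might be blended into a single star intensity. Everything in it is an illustrative assumption: the signal names, the weights, and the threshold are invented for this sketch, not part of the actual prototype (which is a Figma Make mockup with no live sensor pipeline).

```python
from dataclasses import dataclass

# Hypothetical snapshot of the passive signals for one candidate moment.
# Field names, ranges, and weights are assumptions for illustration only.
@dataclass
class MomentSignals:
    hr_elevation: float      # heart-rate rise over baseline, 0..1
    neuro_spike: float       # proxy for a neurochemical response, 0..1
    movement_novelty: float  # how unusual the movement pattern is, 0..1
    context_richness: float  # density of photos/calendar/location cues, 0..1

def star_intensity(s: MomentSignals) -> float:
    """Blend the signals into a single 0..1 intensity score."""
    score = (0.35 * s.hr_elevation
             + 0.35 * s.neuro_spike
             + 0.15 * s.movement_novelty
             + 0.15 * s.context_richness)
    return max(0.0, min(1.0, score))

def is_star(s: MomentSignals, threshold: float = 0.5) -> bool:
    """A moment becomes a star only if its blended intensity clears the bar;
    everything below the threshold stays out of the galaxy."""
    return star_intensity(s) >= threshold
```

The key design property this sketch preserves is that the user never enters anything: a moment either clears the bar from its passive signals or it quietly disappears.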
How we built it
Luma came together in three main phases: ideation with Claude, wireframing in Figma, and prototype generation with Figma Make. Each phase built directly on the one before it.
Phase 1: Ideation with Claude

The idea started as a conversation within our team about whether it was possible to quantify chronoception, the sense of time passing. Before designing anything visually, we used Claude to help expand the concept.
We began with the problem space: there is no tool that captures or visualizes chronoception. Claude acted as a thinking partner as we explored the idea from different angles. What exactly is the sense we are trying to measure? How could meaningful moments be detected passively? What should happen to the data that does not become a “star” moment?
Those conversations helped surface design decisions that are usually left unspoken. For example, we chose to make Luma entirely retrospective. There is no real-time or ambient mode. That choice came from discussing whether a presence-aware system would feel supportive or more like surveillance. We decided it would likely feel overwhelming.
Another decision involved how to represent internal data. Instead of showing raw biometric numbers, we explored ways to represent what the body is actually experiencing. That led to the idea of neurochemical proxy signatures. Rather than displaying heart rate or sensor readings, Luma shows dopamine and serotonin curves paired with descriptive labels that reflect the emotional state behind the data.
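The proxy-signature idea above could be sketched as a simple mapping from proxy levels to descriptive labels. The thresholds and the vocabulary here are invented for illustration; the prototype itself only displays static examples of these labels.

```python
def proxy_label(dopamine: float, serotonin: float) -> str:
    """Map hypothetical dopamine/serotonin proxy levels (0..1) to a
    descriptive label instead of exposing raw sensor numbers.
    Thresholds and wording are assumptions, not a clinical model."""
    if dopamine > 0.7 and serotonin > 0.7:
        return "elated and at ease"
    if dopamine > 0.7:
        return "excited, seeking"
    if serotonin > 0.7:
        return "calm contentment"
    return "quiet baseline"
```

The point of the indirection is the same as in the design: the user reads an emotional state, never a number.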
Phase 2: Wireframing in Figma

Once the concept felt solid, we moved into Figma to work out the structure of the interface before introducing visual design.
We created wireframes for four main screens.
Galaxy View established the overall layout. This included the placement of the bottom summary card, the balance between the canvas space and the metrics card, and where the instructional text would appear. One important structural choice was anchoring the universe summary card to the bottom instead of letting it float. This gave the galaxy space to breathe while keeping key information easy to reach.
System View focused on translating the feeling of time passing into something interactive. During ideation, our team kept returning to the question: what would it feel like to hold a star system in your hands? We wanted the stars and galaxies to be the main focus. When users open this screen, they see the star system first. They can rotate and explore it freely, then tap into clusters to look closer at specific moments. Supporting details appear in summary cards that help guide the exploration.
Star Detail began as a layout with multiple cards for each section, including neurochemical signatures, location data, and notes. When we reviewed the wireframes, it became clear that the cards created too much visual noise. Seeing the layouts side by side made the solution obvious. We removed the cards and let the content float directly on the dark background so the moment itself felt more central.
Photo Gallery was designed as a full screen viewer with a thumbnail strip and dot indicator. The interesting part was how it connects back to the star system. Instead of entering the gallery through a separate navigation element, the floating orbs in the star detail screen act as entry points into the photos tied to that moment.
Phase 3: Prototype with Figma Make

The final high-fidelity prototype was generated using Figma Make. We wrote a detailed prompt that captured everything we had established during ideation and wireframing.
Instead of a short description of screens, the prompt acted more like a design specification. It included the full color palette with hex values for the five star system colors, both neurochemical colors, and the different text opacity levels. It also described how the galaxy should render, including the glowing sphere background, realistic star glow, and slow auto rotation with drag momentum.
We also specified the card free layout for the star detail view, the orbiting photo system, and interaction rules such as tap behavior and back navigation.
When the output needed adjustments, we updated the prompt rather than editing the prototype manually. Treating the prompt as the source of truth made the design easier to reproduce and kept the intent clear.
The final prototype runs entirely in the browser. It includes four navigable layers, two live canvas renders, a tappable orbit system, and a swipeable photo gallery.
Challenges we ran into
The biggest challenge was the idea itself. We kept circling, proposing concepts that either felt too close to something that already existed or didn't genuinely address a sensory experience in a meaningful way. We finally settled on chronoception because it let us explore a sensory experience that is deeply human yet invisible. It gave us a design problem that mattered and could actually change how people perceive and engage with their own experience of time.
Accomplishments that we're proud of
We designed a genuinely new sensory interface for something that has never had one: the perception of time itself. We also built a privacy model that is structurally sound, not just policy-deep; the data architecture is designed so that misuse is prevented by construction rather than by promise.
What we learned
Speculative design is hardest when the concept is emotionally right but technically invisible. We had to resist the urge to add manual input, social features, and gamification; all the things that would have made it feel more like an app and less like a mirror. We also learned that the best wellness tools don't tell you what to do. They change how you see. Luma doesn't push notifications or set goals. It just makes the texture of your life legible, and trusts that seeing it clearly is enough.
What's next for csm
As we iterated with Figma Make, we encountered challenges in communicating our visual intent. With more time, we would likely refine the spatial fidelity of the galaxy, narrowing the gap between the envisioned experience and what the prototype currently renders. Additionally, we would address inconsistencies in the underlying logic of the system to ensure a more coherent and seamless user experience.
Built With
- chatgpt
- claudeai
- figma
- figmamake
- finalcut