Inspiration
The inspiration for Aeon Sync was born from a teammate's own daily reality. Living with neurodivergence, they have always struggled with "Time Blindness": the invisible, frustrating gap between the time on the wall and their internal perception of it. For them, time isn't a steady stream; it is a fog that either stands still or rushes by in a blur. They wanted to build a tool that didn't just tell them the time, but helped them make sense of it.
What it does
Aeon Sync is a bio-responsive speculative interface designed to bridge the gap between biological reality and subjective experience. It acts as a "sensory anchor" for individuals struggling with Time Blindness and circadian misalignment. The app functions through a three-tier system:

- Identify: It tracks Temporal Drift by comparing the user's internal chronotype with real-world solar time.
- Visualize: It translates the invisible state of the user's focus into the Chrono-Pulse, a generative 3D shape that thins or vibrates based on cognitive load and presence.
- Manipulate: It serves as a control center for Zeitgeber interventions, such as the "Light Anchor," which adjusts ambient luminance to a specific Kelvin value to reset the brain's clock.
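To make the "Light Anchor" idea concrete, here is a minimal sketch that maps a local hour to a target color temperature in Kelvin. The breakpoints and Kelvin values below are illustrative placeholders, not the values Aeon Sync actually uses.

```python
def light_anchor_kelvin(hour: int) -> int:
    """Map a local hour (0-23) to a target color temperature in Kelvin.

    Hypothetical schedule: cool, bright light in the morning to nudge
    the internal clock earlier; warm, dim light at night to wind down.
    """
    if 6 <= hour < 11:   # morning: cool daylight
        return 6500
    if 11 <= hour < 17:  # midday: neutral white
        return 5000
    if 17 <= hour < 21:  # evening: warm
        return 3000
    return 2200          # night: very warm, low stimulation

# Example: an evening reading
print(light_anchor_kelvin(19))  # 3000
```

A real intervention would interpolate smoothly between these anchors rather than jumping at hard breakpoints, but the step function keeps the idea legible.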
How we built it
Aeon Sync was a collaborative effort between human intuition and generative intelligence:

- Research & Ideation: We used NotebookLM and Gemini as our primary research partners. While the core concept of a Bio-Mirror was purely ours, these AI tools were instrumental in structuring our research on the human senses and navigating the complexities of circadian phase-response curves.
- Prompt Engineering: We used AI to structure advanced Figma Make prompts. This allowed us to iterate through dozens of biomorphic UI layouts quickly, ensuring the "Next Frontier" aesthetic was both professional and speculative.
- Prototyping: We built the final interface in Figma Make and used Figma Design to structure our UX research.
Challenges we ran into
The greatest challenge was translating a deeply abstract struggle into a functional speculative tool:

- Empathetic Engineering: Moving beyond our own experience to understand how different neurodivergent users perceive time was critical. We had to design a system that was supportive without being overstimulating.
- Structuring the Unseen: Defining a visual language for a sense that has no physical form required radical experimentation. We had to determine how to "show" a fading social battery or a drifting rhythm using only light, color, and motion.
Accomplishments that we're proud of
- The Bio-Mirror Framework: We are incredibly proud of developing a UI that doesn't just display data but "mirrors" it. Creating a system where the interface's visual weight and clarity directly correlate with the user's mental presence was a major design breakthrough.
- Effective AI Integration: We successfully used Figma Make to iterate on complex biomorphic shapes that would traditionally have taken weeks to model. This allowed us to focus our energy on the high-level empathy and scientific logic of the project.
- Neuro-Inclusive Design: We transformed a personal struggle with neurodivergence into a universal design solution. Seeing a theoretical concept like Chronoception turn into a clickable, intuitive prototype was our most rewarding milestone.
What we learned
Through this project, we dove deep into the world of Chronoception (the subjective perception of time) and its connection to circadian biology. We discovered that our "internal clock" is governed by Zeitgebers: environmental cues like light and temperature. We learned that mental fitness is achieved when we minimize the Temporal Delta, the difference between our internal biological tempo and the external solar tempo. By using environmental interventions, we can manipulate this equation to bring the user back into a state of Circadian Alignment.
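The Temporal Delta described above can be sketched as a simple absolute difference between the two tempos. The minute-based representation and the 60-minute alignment threshold below are hypothetical choices for illustration, not part of the project's actual model.

```python
from datetime import time

def _minutes(t: time) -> int:
    """Convert a clock time to minutes past midnight."""
    return t.hour * 60 + t.minute

def temporal_delta(internal_midpoint: time, solar_noon: time) -> int:
    """Absolute offset, in minutes, between the user's internal tempo
    (e.g. chronotype midpoint) and the external solar tempo."""
    return abs(_minutes(internal_midpoint) - _minutes(solar_noon))

def is_aligned(delta_minutes: int, threshold_minutes: int = 60) -> bool:
    # The threshold is an illustrative placeholder for
    # "close enough to count as Circadian Alignment".
    return delta_minutes <= threshold_minutes

# A late chronotype (internal midpoint 14:30) vs. solar noon at 12:57:
delta = temporal_delta(time(14, 30), time(12, 57))
print(delta, is_aligned(delta))  # 93 False
```

An intervention like the Light Anchor would then aim to shrink this delta over successive days until `is_aligned` returns true.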
What's next for AEON SYNC
The current version of Aeon Sync is a speculative foundation, but the future involves deeper hardware integration:

- Haptic Wearable Integration: We aim to develop a physical "Tactile Metronome" that translates the screen's pulses into micro-vibrations, allowing users to "feel" time without looking at a screen.
- Phase-Response AI: Implementing a machine-learning model that learns a user's specific Phase-Response Curve over time to provide even more personalized "Metabolic Gating" and light therapy recommendations.
- Collaborative Syncing: Expanding the tool for teams, allowing colleagues to see each other's "Internal Tempo" to foster empathy and schedule meetings during overlapping "Flow States."
Built With
- circadian-phase-response
- claude
- figma
- figma-design
- figma-make
- gemini
- interoceptive-feedback-loops
- notebook-llm
