Inspiration

Digital environments increasingly shape how we think, focus, and feel throughout the day. Notifications, algorithmic feeds, and constant task switching can fragment attention in ways that are difficult to notice in real time.

The inspiration behind NAU was the idea that people currently have no clear way to see how their digital environment is affecting their cognition. While we can track steps, sleep, and heart rate, we lack tools that help us understand our mental alignment and cognitive readiness.

NAU explores what a product might look like if people could visualize their cognitive state and receive guidance to help protect their focus.

What it does

NAU helps users understand and manage their cognitive state in real time.

Using signals captured from wearable sensors, the system maps several cognitive indicators, including focus stability, mental load, emotional energy, digital strain, and algorithmic influence.

These signals combine to produce an Alignment Score, which represents how mentally ready a user is to take on new tasks.
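The source does not specify how the signals are combined, but one plausible sketch is a weighted blend of normalized signals, where supportive signals raise the score and draining ones lower it. The signal names, weights, and baseline below are illustrative assumptions, not NAU's actual model:

```python
# Hypothetical Alignment Score: a weighted blend of normalized (0-1)
# cognitive signals. Weights and baseline are illustrative assumptions.

def alignment_score(signals: dict[str, float]) -> int:
    """Combine 0-1 signals into a single 0-100 readiness score.

    Focus stability and emotional energy raise the score; mental load,
    digital strain, and algorithmic influence lower it.
    """
    weights = {
        "focus_stability": 0.30,        # positive contributor
        "emotional_energy": 0.20,       # positive contributor
        "mental_load": -0.20,           # drains readiness
        "digital_strain": -0.15,        # drains readiness
        "algorithmic_influence": -0.15, # drains readiness
    }
    # Start from a neutral baseline and shift by each weighted signal;
    # missing signals are treated as 0 (no effect).
    raw = 0.5 + sum(weights[k] * signals.get(k, 0.0) for k in weights)
    return round(max(0.0, min(1.0, raw)) * 100)
```

With no signals the score sits at a neutral 50; strong focus and energy with low strain would push it toward 100.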

The app then translates those signals into a simple interface that shows:

  • Right Now: what is currently influencing the user’s cognitive state
  • Suggestions: AI-generated recommendations to stabilize attention
  • Cognitive Signals: deeper insights into the drivers of alignment

Users can also view Your Day in Flow, a timeline that shows how alignment changes throughout the day, and access Cognitive Tools that help protect attention.

Over time, NAU builds a Mind Profile that tracks milestones, alignment streaks, and behavioral patterns so users can view their progress toward improving cognitive alignment.
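The alignment-streak tracking described above could work like this minimal sketch; the score threshold and the per-day data shape are assumptions for illustration, not details from the prototype:

```python
from datetime import date, timedelta

# Hypothetical streak counter for the Mind Profile: counts consecutive
# days, ending today, whose daily Alignment Score met a target threshold.
# (Threshold value and data shape are illustrative assumptions.)

def alignment_streak(daily_scores: dict[date, int], today: date,
                     threshold: int = 70) -> int:
    """Return the length of the current run of days at or above
    `threshold`, counting backward from `today`. Days with no recorded
    score break the streak."""
    streak = 0
    day = today
    while daily_scores.get(day, 0) >= threshold:
        streak += 1
        day -= timedelta(days=1)
    return streak
```

A milestone system could then fire when the streak crosses fixed marks (e.g. 7 or 30 days), which keeps the profile logic a thin layer over the daily scores.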

How we built it

We approached NAU as a product design exploration, starting with the core user journey.

First, we mapped the high-level experience flow: sensor setup, brain scan, alignment dashboard, daily cognitive timeline, cognitive tools, and the long-term mind profile.

From there, we drafted low-fidelity wireframes using UXPilot to quickly test the structure of the experience and the main interaction patterns. This allowed us to iterate on the layout and screen hierarchy before committing to detailed design work.

Once the overall flow felt right, we moved into Figma Make to build the higher-fidelity prototype. Figma Make allowed us to design the finer interaction details, refine the visual language, and prototype the brain scan visualization and dynamic UI behaviors.

Challenges we ran into

One of the biggest challenges was arriving at a concept that felt genuinely novel. We wanted NAU to explore a future category of products centered on cognitive awareness, rather than simply extending existing wellness or productivity tools.

Another challenge was achieving the level of detail we envisioned within the constraints of the platform. With roughly 6,000 cumulative credits, we had to carefully prioritize which parts of the product experience to fully prototype and refine.

We also encountered challenges with collaboration and file management in Figma Make. Transferring designs between Figma Make files proved difficult, which made it harder to merge workstreams and iterate collaboratively on shared screens.

Despite these constraints, we were able to build a cohesive prototype that demonstrates the core product concept and interaction flow.

Accomplishments that we're proud of

We’re very proud of developing a complete end-to-end product experience that we believe could be genuinely beneficial to society in an increasingly overwhelming technological world. As digital environments continue to shape how we think, focus, and interact, we wanted to explore what it might look like to design tools that help people better understand and protect their cognitive wellbeing. NAU represents an attempt to reframe technology not as something that competes for attention, but as something that can actively support clarity, focus, and healthier digital habits.

We’re also particularly proud of designing a highly detailed brain scan interaction that visually communicates how the system maps cognitive signals in real time. The interactive 3D brain visualization became the centerpiece of the experience, showing neurons firing and progressively illuminating different cognitive signals as the scan runs. Developing this interaction required experimenting with multiple approaches and pushing the prototyping tools to their limits in order to create something that felt dynamic, believable, and intuitive.

More broadly, we feel we pushed Figma Make as far as it could go, especially when building complex interactions and visual behaviors around the brain visualization. Achieving a convincing scanning experience, where signals propagate across the brain and the interface responds to the user's input, required careful iteration and creative use of the available tools.

What we learned

The biggest takeaway from this project was seeing firsthand how AI-assisted design tools are changing the way products are built. Tools like Figma Make allowed us to move from concept to interactive prototype much faster than traditional workflows, letting us iterate on ideas, interactions, and visual systems in real time.

Instead of treating design as a strictly linear process, we were able to work in a more fluid loop between idea, prompt, prototype, and refinement. This made it possible to explore ambitious interaction concepts, like the brain scan visualization, while continuously improving the user experience.

Overall, this project reinforced our belief that AI will fundamentally reshape product development workflows. Designers and product teams will increasingly spend less time building static artifacts and more time guiding intelligent systems to rapidly generate and refine experiences.

What's next for NAU

The next step for NAU would be expanding the intelligence behind the system and deepening how it learns from users over time. While the current prototype demonstrates how cognitive signals could be visualized and translated into simple insights, future versions could focus on building a more advanced AI insights engine that continuously learns from behavioral patterns and environmental inputs.

Over time, NAU could become better at anticipating shifts in cognitive state and recommending interventions before attention begins to degrade. This might include deeper integrations with digital environments, productivity tools, and operating systems to help manage notifications, digital inputs, and task switching more intelligently.

Another important direction would be improving the sensing layer itself. As wearable and neural sensing technologies evolve, NAU could incorporate additional signals to refine the accuracy of its cognitive models and provide more personalized insights.

Built With

  • claude
  • figjam
  • figma
  • uxpilot