Inspiration

Zen was inspired by a simple but common experience: feeling emotionally overwhelmed or disconnected without knowing exactly why. Many people are not in crisis, but they still struggle with self-doubt, identity questions, or a quiet sense of unease. We noticed that while tools for productivity and mental wellness exist, very few are designed to help people calmly understand themselves over time.

We wanted to build something that listens first. A space that helps people reflect, recognize patterns, and grow without pressure or judgment. Zen was born from the belief that identity is not something to fix or optimize, but something to understand and care for.

What it does

Zen is a mindful identity platform that helps people understand, protect, and evolve who they are. It is built around three core features.

Users begin by journaling freely about their thoughts, emotions, and experiences over time. An AI agent analyzes this input to identify emotional patterns and recurring themes, forming a private, evolving digital self that reflects the user’s identity. This profile is hidden by default and fully controlled by the user.

When users feel uncertain or overwhelmed, Zen offers AI counselling grounded in their own history. An optional feature, Zen Sama, allows users to engage in reflective conversation with wise mentors inspired by historical figures and philosophers who have faced similar struggles.

Meditation is the core of Zen. Guided by an AI meditation master with calming audio and background music, users are led through sessions that help transform insight into presence and acceptance.

Zen does not diagnose, judge, or define users. It supports identity through reflection, dialogue, and mindfulness.

How we built it

We built Zen as a full-stack self-discovery engine that turns a user’s journal into a “digital self” and responds through a council of AI mentors. The backend is FastAPI + LangGraph. A RAG pipeline embeds journal entries with Gemini’s text-embedding-004 and stores them in Supabase Postgres with pgvector; semantic search pulls relevant memories (with a graceful fallback when embeddings fail) and feeds them into agent responses. The frontend is a Next.js 14 app styled with Tailwind and Framer Motion, offering journaling, counsel, and meditation flows. Together, this creates a personal, context-aware guidance system that helps users overcome Solomon’s Paradox by reflecting on their own history with supportive, purpose-built agents.
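The retrieval step with fallback might be sketched roughly as follows. This is a simplified, in-memory illustration: the entry structure and function names are hypothetical, and in the actual system the query vector would come from Gemini’s text-embedding-004 and the entries would live in pgvector rather than a Python list.

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def retrieve_memories(query_vec, entries, k=3):
    """Return the k most relevant journal snippets for a query.

    entries: dicts with 'text', 'embedding' (may be None if the
    embedding call failed), and 'created_at' (a sortable timestamp).
    """
    embedded = [e for e in entries if e.get("embedding")]
    if query_vec and embedded:
        # Semantic path: rank stored entries by similarity to the query.
        ranked = sorted(embedded,
                        key=lambda e: cosine(query_vec, e["embedding"]),
                        reverse=True)
        return [e["text"] for e in ranked[:k]]
    # Graceful fallback: most recent entries when embeddings are unavailable.
    recent = sorted(entries, key=lambda e: e["created_at"], reverse=True)
    return [e["text"] for e in recent[:k]]
```

The retrieved snippets are then injected into the agent prompt, so responses stay grounded in the user’s own history even when the embedding service is temporarily down.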

Challenges we ran into

One major challenge was building a meditation experience that felt truly immersive and interactive. We aimed to implement real-time, two-way audio sessions for the meditation feature so users could be guided dynamically instead of listening to a static script.

However, we ran into setup and response issues with the OpenAI Realtime API during integration. We spent nearly half a day debugging connection flow and response behavior, and with hackathon time constraints, we ultimately had to scope it down and ship a simpler version to ensure the overall product remained complete and demoable.

Accomplishments that we're proud of

We are proud that Zen is not a collection of disconnected features, but a cohesive system where journaling, counselling, and meditation are connected through a shared understanding of the user.

A major accomplishment is our multi-agent AI pipeline. We built agents that analyze journal entries, extract identity signals such as emotions, values, and recurring themes, and store them securely to form an evolving user profile. This foundation allows Zen to feel personal while keeping the user in control.
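One way the per-entry signals could accumulate into an evolving profile is a simple merge of counts, as in this minimal sketch. The field names (`emotions`, `values`, `themes`) and the `update_profile` helper are illustrative assumptions, not the project’s actual schema.

```python
from collections import Counter

def update_profile(profile, signals):
    """Merge one journal entry's extracted identity signals into the
    evolving profile, keeping running counts per category."""
    for key in ("emotions", "values", "themes"):
        profile.setdefault(key, Counter()).update(signals.get(key, []))
    return profile
```

Keeping counts rather than a single label means recurring themes surface gradually, and no single entry can redefine the user.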

We are especially proud of our highly personalized meditation experience. Instead of delivering generic scripts, Zen uses retrieval-augmented generation to pull relevant context from the user’s own reflections and guide sessions in a way that feels grounded, resonant, and emotionally aligned.
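Assembling the session prompt from retrieved reflections might look something like the sketch below; the function name, wording, and fallback line are hypothetical, shown only to illustrate how retrieved context grounds the guidance.

```python
def build_meditation_prompt(snippets, theme="acceptance"):
    """Compose a guidance prompt that weaves in the user's own
    retrieved reflections; falls back gracefully when none are found."""
    context = "\n".join(f"- {s}" for s in snippets) or "- (no prior reflections found)"
    return (
        "You are a gentle meditation guide.\n"
        f"Session theme: {theme}.\n"
        "Weave the user's own reflections into the guidance:\n"
        f"{context}"
    )
```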

What we learned

One of our biggest takeaways is how powerful multi-agent AI architectures can be when building personalized, human-centered products. Instead of relying on a single monolithic model, separating responsibilities across agents allowed us to reason about identity in a more structured and controllable way.

We also learned that personalization becomes significantly more meaningful when AI listens before it speaks. Grounding outputs in retrieved user context through retrieval-augmented generation led to responses that felt more authentic and aligned with the user’s lived experience, especially in meditation and counselling.

What's next for Zen

Next, we want to deepen Zen’s personalization while strengthening its privacy-first foundation. Identity is deeply personal, so our priority is to give users even more control over how their data is stored, retrieved, and used, including greater transparency into how their digital self is formed and accessed by AI agents.

We plan to further expand our use of retrieval-augmented generation to make Zen’s responses even more grounded in long-term user context. This includes richer identity memory, better temporal understanding of emotional patterns, and more precise retrieval to ensure counselling and meditation remain authentic and aligned with the user’s lived experience.

For meditation, we aim to revisit our original vision of a more interactive, audio-guided experience. This includes real-time or semi-real-time sessions that adapt dynamically to user input, emotional state, and pacing, moving beyond static guidance toward a more responsive and immersive meditation flow.

We also plan to introduce a broader range of Zen Sama mentors, spanning different cultures, philosophies, and life experiences. This will allow users to explore their identity through diverse perspectives while maintaining Zen’s emphasis on reflection rather than authority.
