Inspiration
The human brain is incredible at feeling, but notoriously unreliable at storing "atomic" details over long periods. We noticed two major problems in the digital age:
- Digital Graveyards: Most note-taking apps are where memories go to die—unorganized and never revisited.
- AI Amnesia: Standard chatbots might know a lot about the world, but they don't know you. They forget your sister's name, your favorite coffee, or how your relationship with a friend has evolved over years.
We wanted to build the "Pensieve" from the wizarding world—a place where you can offload your thoughts and have them organized into a meaningful, visual, and evolving network.
What it does
Pensieve transforms fleeting conversations into a structured, living memory network by extracting "atomic facts" and relationships in real time. A Gemini 3-powered pipeline identifies core preferences and emotional milestones and organizes them into an interactive Memory Graph that maps your personal world of people, places, and events. Your personalized Pet Buddy serves as the primary interface, using Dual-Retrieval RAG to keep context across years of dialogue so it never forgets a detail, whether it's a subtle dietary preference or a major shift in a relationship. It is more than an archive; it is an evolving companion that learns, remembers, and grows alongside you.
How we built it
Building a "living" memory required more than just a database; it required a pipeline that translates human speech into structured data.
- The Brain (Gemini 3): We used Gemini 3 as the primary engine of our Memory Extraction Pipeline. It doesn't just summarize; it identifies "Entity Triplets" (Subject → Relation → Object).
- The Memory Graph: Instead of a flat list, we used a force-directed graph to visualize memories, letting users see how a person like "Emma" is connected to specific locations, emotions, and events.
- Dual-Retrieval RAG: We implemented a custom Memory RAG system. When a user asks a question, the app searches both vector embeddings (for semantic similarity) and the knowledge graph (for relational context).
- Personalized Companions: To make the tech feel human, we integrated an MBTI-based pet buddy system that acts as the interface between the user and their data.
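The dual-retrieval idea can be sketched in a few lines. This is a minimal, illustrative version, not the app's actual code: memories are stored as triplets, real vector embeddings are stood in for by simple token-overlap scoring, and names like `MemoryStore` are hypothetical. The key point it demonstrates is that graph expansion can surface facts the semantic search alone would miss.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Triplet:
    subject: str
    relation: str
    obj: str

    def text(self) -> str:
        return f"{self.subject} {self.relation} {self.obj}"

class MemoryStore:
    def __init__(self, triplets):
        self.triplets = list(triplets)
        # Graph index: entity -> triplets it participates in.
        self.by_entity = {}
        for t in self.triplets:
            for e in (t.subject, t.obj):
                self.by_entity.setdefault(e.lower(), []).append(t)

    def semantic_search(self, query, k=3):
        """Stand-in for vector search: rank triplets by token overlap."""
        q = set(query.lower().split())
        scored = [(len(q & set(t.text().lower().split())), t) for t in self.triplets]
        scored.sort(key=lambda s: s[0], reverse=True)
        return [t for score, t in scored[:k] if score > 0]

    def graph_expand(self, seeds):
        """Relational retrieval: every triplet touching an entity
        mentioned in the seed results (1-hop neighborhood)."""
        hits = []
        for t in seeds:
            for e in (t.subject, t.obj):
                hits.extend(self.by_entity.get(e.lower(), []))
        return hits

    def retrieve(self, query, k=3):
        """Dual retrieval: semantic hits plus their graph neighborhood."""
        seeds = self.semantic_search(query, k)
        merged, seen = [], set()
        for t in seeds + self.graph_expand(seeds):
            if t not in seen:
                seen.add(t)
                merged.append(t)
        return merged

store = MemoryStore([
    Triplet("Emma", "is sister of", "user"),
    Triplet("Emma", "lives in", "Lisbon"),
    Triplet("user", "prefers", "oat-milk latte"),
])
context = store.retrieve("what does my sister Emma like")
```

Here the coffee preference shares no words with the query, so pure semantic search would drop it; it is pulled in only because graph expansion follows the "user" node reached through the Emma triplet.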
Challenges we ran into
- The "Node Isolation" Problem: Initially, our extraction was too conservative: we'd input pages of dialogue and the graph would show only two nodes. We refined our prompts to encourage "atomic extraction," so even small details (like a favorite food) became independent, linkable entities.
- Entity Resolution: If a user mentions "my partner" in one chat and "Sarah" in another, the AI needs to realize they are the same person. Mapping these evolving relationships in real time without creating duplicate "ghost" nodes was a significant logic puzzle.
- Balancing Privacy and Utility: Since these are deeply personal memories, we had to silo the architecture by user account while still allowing the AI to be proactive and helpful.
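The entity-resolution step above can be sketched as an alias registry. This is a simplified illustration under one assumption: that the extraction model can flag when two mentions refer to the same person. `EntityRegistry` and its methods are hypothetical names, not the app's API; the point is how a merge repoints every old alias so no "ghost" node survives.

```python
class EntityRegistry:
    def __init__(self):
        self.canonical = {}  # alias (lowercased) -> canonical entity name

    def resolve(self, mention):
        """Return the canonical entity for a mention, registering it if new."""
        key = mention.lower()
        if key not in self.canonical:
            self.canonical[key] = mention  # first sighting becomes canonical
        return self.canonical[key]

    def merge(self, alias, target):
        """Declare that `alias` and `target` name the same entity, and
        repoint every mention that previously resolved to the alias."""
        canon = self.resolve(target)
        old = self.resolve(alias)
        for key, value in self.canonical.items():
            if value == old:
                self.canonical[key] = canon

registry = EntityRegistry()
registry.resolve("My partner")         # first mention creates a node
registry.merge("My partner", "Sarah")  # later evidence: same person
node = registry.resolve("my partner")  # now resolves to "Sarah"
```

A production version would also merge the two nodes' edges in the graph; the registry only guarantees that future mentions land on a single canonical node.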
Accomplishments that we're proud of
We take pride in bridging the gap between raw data and human sentiment by engineering a high-fidelity Memory Extraction pipeline that distills complex conversations into structured, atomic facts. A major breakthrough was our Dual-Retrieval RAG system, which counters AI amnesia by letting the Pet Buddy recall deeply buried context. Most importantly, we built an interactive Memory Graph that is not just a technical feat but a living map of the human experience, one that evolves in real time as relationships grow, so every user's digital companion feels like a genuine witness to their life's journey.
What we learned
This project taught us that the future of AI isn't just about "intelligence"; it's about "context."

> "An AI that knows everything about the world is a tool; an AI that knows your world is a companion."

We learned that users don't just want a search engine for their life; they want a system that understands growth. Seeing a node change from "Girlfriend" to "Fiancée" isn't just a data update; it's a milestone. We learned how to bridge the gap between cold data structures and warm human experiences.
What's next for Pensieve AI companion App
The next chapter for Pensieve focuses on transforming it from a reactive tool into a proactive, multimodal life companion. We are expanding its senses to "see" and "hear" through photo and voice integration, allowing for holistic memory capture that includes visual and sensory context. Our roadmap features predictive intelligence that anticipates your needs before you ask, shared memory nodes for collaborative storytelling with loved ones, and emotional wellness insights that track your long-term mental growth. By weaving itself into your broader IoT ecosystem, Pensieve will evolve into an intuitive guardian that doesn't just store your past, but actively enriches and guides your future.