Inspiration

Most AI tools today are powerful, but they're impersonal, forgetful, and require sending your most private thoughts to the cloud.

I wanted a personal AI that actually grows with me. One that remembers my ideas, understands my context over time, and helps me think, not just respond. Most importantly, I wanted that intelligence to stay on my device, not on someone else’s servers.

I set out to build a futuristic assistant, a tiny Jarvis that runs entirely on mobile hardware. That idea became MemoRoo, a private on-device AI memory layer designed to feel less like a tool and more like a true companion.

What it does

MemoRoo transforms messy, unstructured inputs like text, images, voice notes, and PDFs into a structured, living memory graph.

You can:

Spread ideas across an infinite spatial canvas

Explore memories as glowing planets in a 3D stellar map

Chat with an on-device AI that understands your personal context

Track moods, events, habits, and life moments in a Life OS timeline

All AI processing happens fully on-device to ensure speed, privacy, and offline availability.

How we built it

The frontend is built with React and TypeScript and features a neon cyber-glass aesthetic powered by Tailwind. The backend uses FastAPI and PostgreSQL with a clean architecture that keeps AI components modular and maintainable.
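To illustrate the "modular and maintainable" AI-component idea, here is a minimal sketch of how such a backend could keep ingestion pieces swappable behind small interfaces. The names (`MemoryItem`, `Embedder`, `Transcriber`, `ingest`) are illustrative assumptions, not MemoRoo's actual API:

```python
from dataclasses import dataclass, field
from typing import Protocol

# Hypothetical sketch of a modular ingestion layer: each AI component
# (embedding model, speech transcriber) hides behind a small Protocol,
# so on-device backends can be swapped without touching the pipeline.

@dataclass
class MemoryItem:
    source: str                 # "text", "image", "voice", or "pdf"
    raw: bytes                  # the unstructured input as captured
    text: str = ""              # extracted / transcribed content
    embedding: list[float] = field(default_factory=list)

class Embedder(Protocol):
    def embed(self, text: str) -> list[float]: ...

class Transcriber(Protocol):
    def transcribe(self, audio: bytes) -> str: ...

def ingest(item: MemoryItem, transcriber: Transcriber, embedder: Embedder) -> MemoryItem:
    """Normalize any input into text, then embed it for the memory graph."""
    if item.source == "voice":
        item.text = transcriber.transcribe(item.raw)
    elif not item.text:
        item.text = item.raw.decode("utf-8", errors="ignore")
    item.embedding = embedder.embed(item.text)
    return item
```

With this shape, the Whisper-tiny transcriber or the TFLite embedder can each be replaced independently, which is the kind of separation a clean architecture buys you.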

I integrated several on-device AI pipelines:

Quantized LLM via ExecuTorch

TFLite int8 embedding models

OCR and Whisper-tiny transcription

Faiss or Annoy for fast vector search in the RAG pipeline

Custom layout and inference logic for the 3D memory graph
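The retrieval step of the RAG pipeline can be sketched with a brute-force stand-in for the Faiss/Annoy index: embed the query, score it against every stored memory by cosine similarity, and return the top matches. The function names and `top_k` default below are assumptions for illustration, not MemoRoo's actual code:

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: list[float], memories: dict[str, list[float]], top_k: int = 3) -> list[str]:
    """Return the IDs of the stored memories closest to the query embedding.

    A real on-device build would replace this O(n) scan with a
    Faiss or Annoy index, which keeps search fast as memories grow.
    """
    ranked = sorted(memories, key=lambda mid: cosine(query, memories[mid]), reverse=True)
    return ranked[:top_k]
```

The retrieved memory IDs would then be resolved to their text and prepended to the LLM prompt, which is what lets the on-device model answer with personal context.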

Everything is optimized for Arm CPUs and Mali GPUs, ensuring MemoRoo is fast, lightweight, and fully offline.

Challenges we ran into

Building MemoRoo on-device came with a lot of tough challenges. Running an LLM smoothly on a device with limited memory pushed me to optimize every part of the pipeline. I also had to keep vector search fast while storing hundreds of embeddings and make the 3D memory engine perform well without relying on heavy libraries. Designing storage models that could handle messy, real-world data took careful planning, and getting the spatial canvas interactions to feel natural and fluid required many iterations. On top of all that, I constantly had to balance aesthetics with device performance to make sure MemoRoo felt beautiful but stayed lightweight.

Accomplishments that we're proud of

I’m really proud of what MemoRoo has become. I built a full on-device RAG pipeline that runs entirely privately, and a custom 3D graph engine that feels smooth and cinematic. The Life OS automatically links memories, moods, and events, creating a living map of my life. I designed a clean full-stack architecture that can scale as the system grows, and added UI animations that feel futuristic without compromising performance. In the end, MemoRoo genuinely feels like a small personal operating system: something I can carry with me and rely on every day.

What's next for MemoRoo – Your On-Device AI Memory Layer

Looking ahead, I plan to add local-first syncing across devices so my memories and ideas can move seamlessly with me. I want to support more AI models with dynamic switching to make the system even smarter and more flexible. I’m also expanding the 3D universe with richer physics and layouts to make exploration more immersive. My goal is to release downloadable iOS and Android versions, add deeper personal analytics and long-term memory insights, and explore a plugin system that lets users create custom AI workflows. These improvements will make MemoRoo even more powerful, personal, and engaging while staying fully private and on-device.
