## Inspiration
The inspiration came from using AI coding tools like Claude Code and Codex: they never remembered my name, who I am, what my business is about, or who my customers are. Outreach, calls, leads, businesses, coding, none of it can be easily automated unless the tool remembers the entire workflow that took you hours to figure out. The best way to do this is persistent memory.
## What it does
This is what Cognesia brings you: persistent memory shared between all of your agents, all running in parallel. You can have ten agents on one computer and ten on another, connected through an MCP server, skills, or the CLI. Call it a Jarvis brain: you can talk to it, and it can control your phone or your computer. It will mimic your voice, make fun of you, have humor and moods, and it will forever remember everything about your business, allowing you to start up multiple different businesses. It can do research using different APIs and different AI models, such as:
- Manus
- Gemini
- Perplexity
- ChatGPT
- Anthropic (Claude)
- any of these models via API
- all in parallel, even Chinese models like Z AI, GLM 5.6, or Minimax 2.6, or any other publicly available model

Cognesia can communicate with any of these as long as there is a harness like Claude Code, Codex, or OpenCode. These harnesses let a tool connect over MCP, or over a CLI with bash commands or skills, to the persistent memory: a live Postgres database, a vector database, and a knowledge graph, with a nice UI where you can see all of your markdown files inside an Obsidian vault, built on up-to-date open-source code. The AI creates its own PRs and remembers what it did while you sleep: making memories, building synapses, coming up with ideas, and building businesses while you do nothing, using enterprise-grade software to run any AI as an actual business. It uses automation tools for workflows, ElevenLabs for making voices so you can talk to it, and Deepgram for speech-to-text so it can communicate on your phone, with Expo for the app on all platforms (Android, iOS, Mac, Windows). We may also try Electron where necessary, using a React framework, preferably with the 21st.dev MCP.

## How we built it

As explained above: Claude Code or Codex, plus Wispr Flow to talk to the AI, with multiple different agents running in parallel in a queen-bee setup where one manager talks to the sub-agents. This was implemented with different skills and different MCPs, using the web and Manus API calls.

## Challenges we ran into

Searching and doing research on the web took a long time, because Opus 4.7 has very slow token input and output rates and is very expensive.

## Accomplishments that we're proud of

How consistent the persistent memory is.
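A minimal sketch of what that persistent memory layer looks like. The write-up describes Postgres plus a vector database; here sqlite3 and a toy bag-of-words "embedding" stand in so the idea runs anywhere, and every name (`MemoryStore`, `remember`, `recall`) is hypothetical rather than the project's actual API:

```python
# Illustrative sketch only: sqlite3 and bag-of-words similarity stand in for
# the real Postgres + vector database + knowledge graph behind MCP.
import sqlite3
from collections import Counter

def embed(text: str) -> Counter:
    # Toy stand-in for a real embedding model: bag of lowercase words.
    return Counter(text.lower().split())

def similarity(a: Counter, b: Counter) -> float:
    # Cosine-like overlap between two bags of words.
    overlap = sum((a & b).values())
    denom = (sum(a.values()) * sum(b.values())) ** 0.5
    return overlap / denom if denom else 0.0

class MemoryStore:
    """Persistent memory shared by every agent that connects to it."""

    def __init__(self, path: str = ":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute("CREATE TABLE IF NOT EXISTS memories (text TEXT)")

    def remember(self, text: str) -> None:
        self.db.execute("INSERT INTO memories VALUES (?)", (text,))
        self.db.commit()

    def recall(self, query: str, k: int = 3) -> list[str]:
        rows = [r[0] for r in self.db.execute("SELECT text FROM memories")]
        q = embed(query)
        rows.sort(key=lambda t: similarity(q, embed(t)), reverse=True)
        return rows[:k]
```

Any harness that can run a tool call (MCP, a skill, or a bash command) would hit the same store, which is what makes the memory shared across agents.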
How it lives on its own, coming up with a name and an age for itself, and even a family.

## What we learned
- How to prompt
- How to context engineer
- How to harness engineer
- How to work as a team on GitHub, with PRs and worktrees

## What's next for Cognesia

To get investors, specifically Y Combinator, to invest in us so that we can get funding and Claude credits to expand the business, do research, and hire staff: software engineers and hardware engineers. We will come up with businesses, plans, and apps, all documented inside the brain as the engineers work together in parallel with AI.
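The queen-bee setup described in "How we built it" (one manager fanning a task out to sub-agents running in parallel) can be sketched with asyncio. `query_model` is a hypothetical stub; a real build would call the actual model APIs or harnesses:

```python
# Hypothetical sketch of the queen-bee pattern: a manager launches every
# sub-agent at once and gathers their answers in order.
import asyncio

async def query_model(name: str, task: str) -> str:
    # Stand-in for a real API call to one model or harness.
    await asyncio.sleep(0)  # where network latency would occur
    return f"{name}: notes on {task}"

async def queen_bee(task: str, workers: list[str]) -> list[str]:
    # The manager agent fans the task out and merges the results.
    return await asyncio.gather(*(query_model(w, task) for w in workers))

results = asyncio.run(queen_bee("lead research", ["Gemini", "Perplexity", "ChatGPT"]))
```

Running sub-agents concurrently like this is what keeps slow, expensive model calls from serializing the whole workflow.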
## Built With
- claude
- code
- codex
- css
- files
- github
- html
- javascript
- kuzu
- markdown
- neo4j
- neon
- obsidian
- postgresql
- python
- react
- typescript
