The AI memory space has gotten crowded fast.
In the last year: Mem0 raised $24M. Letta raised $10M. Supermemory raised $3M. A new “AI memory” repo hits GitHub trending almost every week, usually built by someone over a weekend. And every major AI platform — Claude, ChatGPT, Gemini — has shipped some version of built-in memory.
Everyone’s building memory. The question is: what kind, for whom, and can you actually read it?
## The quick overview
| | Basic Memory | Mem0 | Letta | Supermemory | Weekend projects |
|---|---|---|---|---|---|
| What it is | Knowledge base for you | Memory API for developers | Agent framework | Memory infra for developers | Varies |
| You can read what’s stored | Yes — it’s just files | No | Partially | No (it’s a backend service) | Rarely |
| Open source | AGPL | Partial | Apache 2.0 | No | Usually |
| Funding | Bootstrapped, profitable | $24M VC | $10M VC | $3M VC | $0 |
| Cloud option | $19/mo | Enterprise pricing | Enterprise pricing | Pay-per-query API | N/A |
| Primary interface | Files + knowledge graph | API | Agent framework | API | Varies |
## Basic Memory: your memory, in files you own
Basic Memory stores everything as plain text files. When your AI writes a memory, it creates a note you can open in any text editor. Notes connect to each other through semantic links, forming a knowledge graph that grows over time.
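To illustrate how a knowledge graph can fall out of plain files, here is a toy sketch that scans a folder of notes and maps each one to the notes it links to. The wiki-style `[[link]]` syntax is an assumption for the example, not Basic Memory's documented on-disk format:

```python
# Toy sketch: derive a link graph from a folder of markdown notes.
# Assumes notes reference each other with [[wiki-style]] links;
# Basic Memory's actual link syntax may differ in details.
import re
from pathlib import Path

LINK = re.compile(r"\[\[([^\]]+)\]\]")

def build_graph(folder: str) -> dict[str, set[str]]:
    """Map each note's name to the set of notes it links to."""
    graph = {}
    for path in Path(folder).glob("*.md"):
        text = path.read_text(encoding="utf-8")
        graph[path.stem] = set(LINK.findall(text))
    return graph
```

The point of the sketch: because the storage is just text, the graph is recoverable with a regex and a directory walk — no proprietary export step required.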
What we do well:
- Transparency. You can read, edit, and delete anything your AI “knows” about you. Open a folder, read the files.
- Bidirectional. You and your AI read and write the same files. You can edit a note and your AI will see it. Your AI writes a note and you can read it.
- Portable. It’s plain text. If Basic Memory disappeared tomorrow, you’d still have useful, readable files. Try that with an API.

Where we’re honest about limitations:
- We’re not an enterprise platform. No org-wide user management (yet).
- We’re a smaller team than our VC-funded competitors.
Cloud pricing: $19/month for Basic Memory Cloud.
## Mem0: the developer memory API
Mem0 has raised $24M and built a popular developer ecosystem. Their approach is a memory API — developers send data in, it stores and retrieves memories, and they integrate it into their apps.
What Mem0 does well:
- Developer ecosystem. Extensive documentation, lots of integrations, a large community.
- Simple API. `mem0.add()` and `mem0.search()` — clean and fast to integrate.
- Enterprise features. If you’re building a product that needs managed memory infrastructure, they’re built for it.
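To show the shape of that add/search pattern, here is a toy in-memory stand-in. This is not the real Mem0 SDK — its signatures, storage, and retrieval are different — just an illustration of the pattern developers integrate:

```python
# Toy stand-in for the add/search pattern a memory API exposes.
# NOT the real Mem0 SDK: real services persist memories server-side
# and rank by semantic similarity, not raw keyword overlap.
class ToyMemory:
    def __init__(self):
        self.items: list[str] = []

    def add(self, text: str) -> None:
        """Store a memory string."""
        self.items.append(text)

    def search(self, query: str, k: int = 3) -> list[str]:
        """Return the k stored memories sharing the most words with the query."""
        words = set(query.lower().split())
        scored = sorted(
            self.items,
            key=lambda t: len(words & set(t.lower().split())),
            reverse=True,
        )
        return scored[:k]
```

Note what the pattern implies: the app developer sees `add` and `search`; the end user sees neither — which is exactly the opacity tradeoff discussed below.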
The tradeoffs:
- Opaque by design. There’s no user-facing interface to see what’s stored. It’s a black box for developers to build on top of.
- VC dynamics. $24M in funding means Mem0 needs to become a very large business. That shapes product decisions.
- Users of apps built on Mem0 have no visibility into what the app remembers about them.
## Letta: the agent framework with memory
Letta (formerly MemGPT) started as a research project exploring how to use an LLM’s context window as a virtual memory system. They’ve since evolved into a full agent framework with persistent state.
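The core MemGPT idea can be sketched in a few lines: treat the context window as a fixed budget, page overflow out to an archival store, and pull it back on demand. The budget, eviction policy, and keyword recall below are illustrative simplifications, not Letta's actual implementation:

```python
# Toy sketch of the MemGPT idea: the context window as "RAM".
# When in-context messages exceed a token budget, evict the oldest
# into an archival store; archived items can be recalled by search.
# Word counts stand in for real tokenization here.
class PagedContext:
    def __init__(self, budget: int):
        self.budget = budget          # max total "tokens" kept in context
        self.context: list[str] = []  # what the model sees each turn
        self.archive: list[str] = []  # evicted long-term memories

    def _tokens(self) -> int:
        return sum(len(m.split()) for m in self.context)

    def append(self, message: str) -> None:
        """Add a message, evicting oldest messages once over budget."""
        self.context.append(message)
        while self._tokens() > self.budget and len(self.context) > 1:
            self.archive.append(self.context.pop(0))

    def recall(self, keyword: str) -> list[str]:
        """Search archived memories for a keyword."""
        return [m for m in self.archive if keyword.lower() in m.lower()]
```

Even in this toy form, the key property is visible: memory lives behind the paging mechanism, not in a folder the user can open.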
What Letta does well:
- Research-driven. They think deeply about how memory should work for agents. Their MemGPT paper was genuinely innovative.
- Full agent framework. If you’re building complex AI agents that need state management, Letta provides serious infrastructure.
The tradeoffs:
- It’s an agent framework, not a memory tool. If you just want persistent memory, it’s a lot of machinery.
- $10M VC-funded. The product roadmap serves investor returns as much as it serves users.
- Still abstracted. Even as they move toward file-based approaches, memory is several layers away from “open a folder and read markdown.”
## Supermemory: backend memory infrastructure
Supermemory is a B2B infrastructure play — a memory API that developers embed in their products to power AI features. Think of it like a database service, but for AI memory.
What Supermemory does well:
- Infrastructure focus. Fast, scalable memory retrieval built on Postgres and a vector engine.
- Developer-friendly. Clean SDKs, works with OpenAI, Anthropic, and other providers.
The tradeoffs:
- Not a consumer product. End users of apps built on Supermemory have no way to see, edit, or export what’s stored about them.
- VC-funded ($3M). Same dynamics as the others.
- Closed source. You’re dependent on their infrastructure.
## Weekend projects and open-source experiments
Every few weeks something new hits GitHub trending — a weekend build that adds memory to Claude or ChatGPT, usually by storing conversation snippets in a vector database. Some are clever. Most solve a narrow problem.
These are worth watching, but come with real tradeoffs: no ongoing maintenance, no sync, no knowledge graph, and often no way to manage what’s stored once you’ve accumulated it.
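The typical weekend-build pattern is easy to sketch. Real projects call an embedding model and a vector database; the bag-of-words vectors below stand in for an embedding so the example stays self-contained:

```python
# Minimal sketch of the common weekend-project design: store
# conversation snippets as vectors, retrieve by cosine similarity.
# Counter-based bag-of-words vectors stand in for a real embedding model.
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy 'embedding': word-count vector."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class SnippetStore:
    def __init__(self):
        self.snippets: list[tuple[str, Counter]] = []

    def remember(self, text: str) -> None:
        self.snippets.append((text, embed(text)))

    def recall(self, query: str, k: int = 2) -> list[str]:
        q = embed(query)
        ranked = sorted(self.snippets, key=lambda s: cosine(q, s[1]), reverse=True)
        return [text for text, _ in ranked[:k]]
```

This works for retrieval, but notice what it lacks: no way to browse, edit, or prune what has accumulated — which is the management gap noted above.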
## The fundamental question
Here’s what it comes down to: can you read what your AI knows about you?
With Basic Memory, the answer is yes — always. Your memory is a folder of plain text files. You can open any of them, edit them, delete them, or take them somewhere else entirely.
With Mem0, Letta, and Supermemory, memory is managed for developers. They interact with it through APIs. End users interact with whatever interface the developer built — and the actual storage is invisible.
Neither philosophy is wrong. But they’re solving different problems for different people.
Mem0 and Supermemory are infrastructure for developers building apps. Letta is infrastructure for developers building agents. Basic Memory is for you — the person who wants to remember things across conversations, own what you know, and actually be able to read it.
## Who should use what
Choose Basic Memory if:
- You want to own your memory as readable files
- You use Claude and want MCP-native memory that just works
- You value transparency
- You don’t want to depend on a VC-funded company’s roadmap
Choose Mem0 if:
- You’re a developer building an app that needs memory as a service
- You need enterprise-grade managed infrastructure
- Opacity is fine — your users don’t need to see what’s stored
Choose Letta if:
- You’re building complex AI agents that need state management
- You want a full agent framework, not just memory
- You’re doing serious research on memory architectures
Choose Supermemory if:
- You’re a developer who wants fast, scalable memory infrastructure as a backend service
- You’re building at enterprise scale and cost efficiency matters
Watch the weekend projects if:
- You’re curious and want to experiment
- You’re a developer who wants to understand how memory systems work under the hood
## Our perspective
We’re not trying to be Mem0 or Letta. We’re not building infrastructure for other developers to build on (though we might consider expanding in that direction soon). We’re building for the person who wants their AI to remember things — and who believes they should be able to read, edit, and own what it remembers.
The memory space is big enough for all of these approaches. But we think the future of personal AI memory is transparent. Plain text has outlasted every proprietary format in computing history. It’ll outlast whatever comes next too.