
Letta Teardown — Agent Memory Framework from MemGPT Researchers ($10M Seed, Felicis-Backed)


TL;DR

Letta is the commercial productization of MemGPT, a 2023 UC Berkeley research paper that argued LLM agents shouldn't rely on context windows for memory; instead, they should have an operating-system-style memory hierarchy that pages between fast in-context blocks and slower archival storage. The paper hit 1,000+ citations in under a year. The three authors (Charles Packer, Vivian Fang, Sarah Wooders) spun out, raised a $10M seed from Felicis in September 2024, and shipped Letta as both an open-source framework (Apache 2.0, 16K+ GitHub stars) and a hosted cloud SaaS.

The product is narrow but deep: it gives developers a way to build stateful AI agents that remember conversations, learned preferences, and accumulated context across sessions, without manually stuffing context windows. Memory is stored as structured blocks in Postgres; the agent's "OS" decides what to load into the LLM context each turn, and the framework handles eviction, summarization, and retrieval. Estimated ARR is $2-5M based on Cloud-tier pricing, which puts Letta closer to early commercialization than scale; most usage is still self-hosted OSS.
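To make the "OS-style memory hierarchy" concrete, here is a toy sketch of the paging idea: a bounded in-context memory that evicts older blocks to an archival store rather than dropping them, with a retrieval path back. All names here (`ContextMemory`, `write`, `recall`) are invented for illustration; this is not Letta's actual implementation, which persists blocks in Postgres and uses LLM-driven summarization.

```python
class ContextMemory:
    """Toy model: fast in-context blocks + slower archival storage."""

    def __init__(self, max_blocks=3):
        self.max_blocks = max_blocks
        self.blocks = []    # what gets loaded into the LLM prompt each turn
        self.archive = []   # evicted blocks (Postgres plays this role in Letta)

    def write(self, block):
        self.blocks.append(block)
        # "Paging": when in-context memory is full, evict the oldest block
        # to archival storage instead of silently losing it.
        while len(self.blocks) > self.max_blocks:
            self.archive.append(self.blocks.pop(0))

    def recall(self, keyword):
        # Retrieval path: search archival storage for evicted context.
        return [b for b in self.archive if keyword in b]


mem = ContextMemory(max_blocks=2)
for note in ["user prefers email", "billing plan: Pro", "timezone: UTC+2"]:
    mem.write(note)

print(mem.blocks)           # only the newest blocks remain in context
print(mem.recall("email"))  # evicted facts are still retrievable
```

The key property this sketch shows: context-window limits bound what the LLM sees per turn, not what the agent can remember overall.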

Why this teardown matters: agent memory is becoming a recognized infrastructure layer in the AI stack, the same way vector databases became one in 2022-2023. Mem0, LangMem (LangChain's memory product), Zep, and Letta are all racing for the wedge. Letta's edge is research credibility plus Felicis backing; its weakness is that "memory" as a standalone product is hard to monetize when LangChain, the OpenAI Assistants API, and Anthropic's MCP can all add memory as a feature. The indie-hacker replication playbook isn't "build another general memory framework"; it's "build memory infrastructure for one specific vertical agent type" where Letta's general-purpose abstraction is overkill.

Quick Facts

| Field | Value |
| --- | --- |
| Domain | letta.com (memgpt.ai redirects) |
| Launched | May 2024 (Letta brand); MemGPT paper Oct 2023 |
| Founders | Charles Packer (CEO), Sarah Wooders (CTO), Vivian Fang |
| Origin | UC Berkeley Sky Computing Lab spinout |
| Funding | $10M seed, September 2024 |
| Lead | Felicis Ventures (Aydin Senkut) |
| Angels | Jeff Dean (Google), Clem Delangue (Hugging Face) |
| ARR | $2-5M (Cloud tier, early commercial) |
| GitHub | 16K+ stars (Letta) + 12K+ (legacy MemGPT) |
| License | Apache 2.0 |
| Pricing | Free OSS / Cloud Free 5K msgs / Pro $99/mo / Enterprise custom |
| Stack | Python, FastAPI, Postgres, SQLAlchemy, Docker |
| Direct competitors | Mem0, LangMem (LangChain), Zep, OpenAI Assistants memory |

5-Minute Walkthrough — Install SDK and Create a Stateful Agent

Step 1: Install. pip install letta or docker run letta/letta. Docker is what most users choose, because the bare Python install requires you to provision Postgres yourself.

Step 2: Start server. letta server start boots a FastAPI server on port 8283. The admin UI at localhost:8283 is functional but not polished; it looks like an internal tool, which fits an engineer-first team.

Step 3: Create agent.

from letta import create_client

# Connects to the local Letta server (localhost:8283 by default)
client = create_client()

# Creating an agent persists it server-side; state survives across sessions
agent = client.create_agent(
    name="support_bot",
    persona="You help customers with billing questions.",  # seeds the persona memory block
    human="The user is a paying customer.",                # seeds the human memory block
    llm_config={"model": "gpt-4o"},
)

Under the hood, Letta creates a Postgres row for the agent, allocates four default memory blocks (persona, human, archival_memory, recall_memory), and registers memory tools the agent can call to read and write its own memory. This is the core MemGPT idea: the agent has tools to manipulate its own memory, the same way a program has syscalls to manipulate its own address space.
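The "memory tools as syscalls" idea can be sketched in a few lines: the LLM emits a tool call, and the runtime dispatches it to a function that mutates the agent's own memory blocks. The tool name `core_memory_append` and the call format below are simplified illustrations, not Letta's actual tool schema.

```python
# Simplified agent state: named memory blocks, as in the persona/human split above
memory_blocks = {
    "persona": "You help customers with billing questions.",
    "human": "The user is a paying customer.",
}

def core_memory_append(block, content):
    """A 'syscall' the agent invokes to edit its own in-context memory."""
    memory_blocks[block] = memory_blocks[block] + " " + content

# Tool registry the runtime exposes to the model
tools = {"core_memory_append": core_memory_append}

# Pretend the LLM emitted this tool call after learning a new fact mid-conversation
tool_call = {
    "name": "core_memory_append",
    "args": {"block": "human", "content": "Prefers invoices by email."},
}

# The runtime dispatches the call; the edit persists into future turns
tools[tool_call["name"]](**tool_call["args"])
print(memory_blocks["human"])
```

The point of the dispatch indirection is that memory edits are ordinary tool calls from the model's perspective, so the same function-calling machinery that drives external actions also drives self-editing memory.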

Step 4: Have a conversation.
