Memwright gives AI agents persistent, ranked memory that stays out of the context window. No Docker. No API keys. No monthly bill. Just install and go.
One package. Works with Claude Code, Cursor, Windsurf, or any MCP client. Scales from your laptop to AWS, Azure, and GCP.
"Four hours debugging a gnarly database migration, 30 back-and-forth messages about schema evolution. Closed the terminal, came back after dinner — Claude had no idea what we'd figured out."
More memories make Memwright better — more candidates to rank from — while the context cost stays the same.
`poetry add memwright`. Two commands. Done. No Docker, no database, no API keys. SQLite + ChromaDB + NetworkX are provisioned automatically.
`memory_recall(query, budget=2000)` — the only memory system that asks "how much space do you have?" before answering. Month 1 and month 12 cost the same.
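A budget-aware recall can be pictured as greedy packing: rank the candidates, then admit them best-first until the token budget is spent. The sketch below is illustrative only (the function name and the ~4-chars-per-token estimate are assumptions, not Memwright's actual tokenizer or API):

```python
def recall_within_budget(ranked_memories, budget_tokens):
    """Greedily pack the highest-ranked memories into a token budget.

    ranked_memories: memory strings, best first.
    Uses a rough 4-chars-per-token estimate (an assumption for
    illustration; a real system would use its tokenizer).
    """
    picked, used = [], 0
    for text in ranked_memories:
        cost = max(1, len(text) // 4)  # crude token estimate
        if used + cost > budget_tokens:
            continue  # doesn't fit; a smaller later item still might
        picked.append(text)
        used += cost
    return picked, used

memories = [
    "User prefers Postgres over MySQL for new services.",
    "The staging cluster runs in us-east-1.",
    "Migration 0042 was rolled back on 2024-03-01 due to a lock timeout.",
]
hits, spent = recall_within_budget(memories, budget_tokens=25)
```

However many memories are stored, only what fits in `budget_tokens` ever reaches the context window — which is why month 1 and month 12 can cost the same.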
Tag matching, graph traversal, vector search, RRF fusion. All algorithmic. Same query = same results. Every time. No GPT calls on every add like Mem0.
"User works at Google" auto-supersedes "User works at Meta." Full history preserved. Zero inference calls. No vector similarity coin-flip.
Namespace isolation, 6 RBAC roles, provenance tracking, write quotas, token budgets. Built for orchestrated pipelines, not bolted on.
Your laptop, AWS App Runner, GCP Cloud Run, Azure Container Apps. PostgreSQL, ArangoDB, or bare SQLite. Same API. Same results.
| | Memwright | Mem0 | Zep | Letta | OpenAI | LangChain |
|---|---|---|---|---|---|---|
| LOCOMO | 81.2% | 66.9% | ~75% | 74% | 52.9% | — |
| Setup | poetry add | API key | Neo4j | Docker+PG | ChatGPT only | Framework lock-in |
| Graph memory | Free, all tiers | $249/mo Pro only | Yes, all tiers | Agent-managed | No | No |
| LLM in retrieval | None (RRF + PageRank) | Yes (every add) | None | Yes (agent calls) | Unknown | Varies |
| Self-host | Yes (zero config) | Yes | Via Graphiti | Docker required | No API access | Yes (OSS) |
| Cost floor | $0 forever | $19/mo | $25/mo | $20/mo | N/A | Free |
| System | P50 | Notes |
|---|---|---|
| Memwright (PG Docker) | 1.4ms | Full 3-layer pipeline, 81.2% LOCOMO |
| Ruflo | 2–3ms | Vector lookup only, not full retrieval |
| Memwright (local) | 9ms | Zero-config, no Docker, no API keys |
| Memwright (GCP Cloud Run) | 156ms | Full cloud API, scale-to-zero |
| Mem0 | 200ms | LLM in retrieval path |
| Zep | <200ms | P95 ~632ms under concurrency |
| Mem0 Graph | 660ms | Graph variant, much slower |
LOCOMO scores are self-reported across vendors. Latency measured with the full 3-layer pipeline (tag + graph + vector). Run yours: `memwright locomo`.
MEMORY.md dumps everything into context. Every line. Every message. Memwright stores memories in a separate process — SQLite + ChromaDB + NetworkX, on disk. Your context window never sees a memory until Claude calls memory_recall.
Most memory systems pick a lane. Memwright picks yours.
SQLite + ChromaDB + NetworkX. Zero config. No network. No Docker. Works on macOS, Linux, Windows. Air-gapped friendly.
App Runner with Starlette ASGI. Terraform templates included. Auto-scaling, HTTPS, custom domains. Full API compatibility.
Container Apps with Cosmos DB. Terraform templates included. Scale-to-zero. Same API, same results, Microsoft cloud.
Cloud Run with AlloyDB. Terraform templates included. Scale-to-zero. 156ms P50. Google's managed infrastructure.
Multi-model database for graph + document + vector in one. ArangoDB Oasis on AWS or self-hosted. Native graph traversal.
pgvector + AGE graph extension. 1.4ms P50 recall. Neon serverless or any Postgres. The fastest backend available.
Dockerfile included. Run anywhere containers run. Air-gapped deployments. Full control over your data and infrastructure.
You're already paying for the brain. Memwright gives it a memory — on infrastructure you already own.
Zero config beats configuration.
Degradation beats failure.
History beats deletion.
Math beats LLMs in retrieval.
Layers beat platforms.
Dedup beats bloat.
Your disk beats their cloud.
Free. Open source. Apache 2.0. Built by Surendra Singh — 15 years in financial services technology.