Patrick Tobey's Local Memory MCP v1 is a self-hosted memory layer for AI assistants — think Claude Desktop or ChatGPT — that stores conversation history in a local ChromaDB vector database rather than shipping it off to a cloud service. It's open-source, and the pitch is simple: persistent memory across AI sessions, with your data never leaving the machine.

The system uses ChromaDB for vector storage and the all-MiniLM-L6-v2 sentence-transformers model for embeddings, both of which run locally after an initial download. No API keys, no external calls. Retrieval is semantic, so when an AI assistant needs to recall something, it surfaces contextually relevant chunks rather than falling back on keyword matching.
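The difference between semantic and keyword retrieval can be shown with a toy sketch. This is not the project's code: real chunks are embedded with all-MiniLM-L6-v2 into 384-dimensional vectors and stored in ChromaDB, whereas here hand-made 3-dimensional vectors stand in for embeddings to keep the idea visible.

```python
import math

# Toy memory: text chunk -> hand-made "embedding". In the real system these
# vectors come from the all-MiniLM-L6-v2 model and live in ChromaDB.
MEMORY = {
    "user prefers dark mode":       [0.9, 0.1, 0.0],
    "project deadline is Friday":   [0.1, 0.9, 0.1],
    "assistant theme set to night": [0.8, 0.2, 0.1],  # semantically near "dark mode"
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def search(query_vec, k=2):
    # Rank stored chunks by vector similarity, not by shared keywords.
    ranked = sorted(MEMORY, key=lambda text: cosine(query_vec, MEMORY[text]),
                    reverse=True)
    return ranked[:k]

# A query vector near the "dark mode" region retrieves both theme-related
# chunks even though "night" and "dark mode" share no words.
print(search([0.85, 0.15, 0.05]))
```

The payoff is the last line: keyword matching would never connect "night" to "dark mode", but vectors that sit close together in embedding space do.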

Where it gets more interesting is in how the system handles updates. Rather than overwriting old entries, Local Memory MCP maintains versioned chains — each update links back to what came before. Tobey calls the broader design approach "AIX" (AI eXperience), a philosophy built around how language models actually process context rather than how people tend to think about organizing notes. In practice that means plain text chunks, minimal metadata (timestamps and confidence scores), and a conflict reconciliation engine that catches when a new write overlaps with existing memory. When a conflict is detected, the system returns structured warnings and self-heal hints back to the model instead of silently clobbering prior context.
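The versioning and reconciliation behaviour described above can be sketched in a few dozen lines. This is an illustrative model only: the field names (`previous_id`, `confidence`, `warnings`) and the crude word-overlap heuristic are assumptions for the sketch, not the project's actual schema or reconciliation logic.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class Chunk:
    id: str
    text: str
    previous_id: Optional[str] = None  # link back to the version this supersedes
    confidence: float = 1.0
    created_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

class MemoryStore:
    def __init__(self):
        self.chunks = {}
        self._next = 0

    def _new_id(self):
        cid = f"chunk-{self._next}"
        self._next += 1
        return cid

    def store(self, text, confidence=1.0):
        # Naive stand-in for the reconciliation engine: warn when a new write
        # shares most of its words with an existing chunk, and return a
        # self-heal hint instead of silently clobbering prior context.
        warnings = []
        new_words = set(text.lower().split())
        for other in self.chunks.values():
            old_words = set(other.text.lower().split())
            overlap = len(new_words & old_words) / max(len(new_words | old_words), 1)
            if overlap > 0.5:
                warnings.append({
                    "conflict_with": other.id,
                    "hint": f"Consider update('{other.id}', ...) instead of store()",
                })
        cid = self._new_id()
        self.chunks[cid] = Chunk(cid, text, confidence=confidence)
        return {"id": cid, "warnings": warnings}

    def update(self, chunk_id, new_text):
        # Updates never overwrite: they append a new version linked to the old.
        cid = self._new_id()
        self.chunks[cid] = Chunk(cid, new_text, previous_id=chunk_id)
        return {"id": cid, "supersedes": chunk_id}

    def get_evolution_chain(self, chunk_id):
        # Walk the version links from newest to oldest.
        chain = []
        while chunk_id is not None:
            chunk = self.chunks[chunk_id]
            chain.append(chunk.text)
            chunk_id = chunk.previous_id
        return chain

m = MemoryStore()
first = m.store("User's favourite editor is Vim")
second = m.update(first["id"], "User's favourite editor is Neovim")
print(m.get_evolution_chain(second["id"]))  # newest version first, then the original
```

The design choice worth noting is that `update` is append-only: the old chunk survives, so the evolution chain can always be replayed, and a conflicting `store` produces structured warnings the model can act on.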

Soft-delete is the default behaviour — deprecated chunks are hidden from normal search results but retained in history and recoverable. The full set of MCP tools exposed to assistants covers store, search, update, delete, get_chunk, and get_evolution_chain.
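Soft-delete semantics are easy to picture with a minimal sketch; the class and method names below are illustrative, not the server's implementation.

```python
# Toy soft-delete store: delete() marks a chunk deprecated rather than
# removing it, so normal search skips it but history retains it.
class SoftDeleteStore:
    def __init__(self):
        self.chunks = {}  # id -> {"text": ..., "deprecated": bool}

    def store(self, cid, text):
        self.chunks[cid] = {"text": text, "deprecated": False}

    def delete(self, cid):
        # Default behaviour: flag, don't remove.
        self.chunks[cid]["deprecated"] = True

    def restore(self, cid):
        self.chunks[cid]["deprecated"] = False

    def search(self, term, include_deprecated=False):
        return [
            cid for cid, c in self.chunks.items()
            if term in c["text"] and (include_deprecated or not c["deprecated"])
        ]

s = SoftDeleteStore()
s.store("c1", "meeting moved to 3pm")
s.delete("c1")
print(s.search("meeting"))                           # [] - hidden from normal results
print(s.search("meeting", include_deprecated=True))  # ["c1"] - still in history
s.restore("c1")
print(s.search("meeting"))                           # ["c1"] - recoverable
```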

Deployment is via Docker Compose or a direct Python 3.11+ install. The MCP server supports stdio transport for local desktop workflows and HTTP SSE transport for remote setups, with bearer-token or OAuth authentication available for the latter.
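For the Docker route, a compose file along these lines would be typical; the service name, port, volume path, and environment variable here are illustrative guesses, not taken from the project's repository.

```yaml
# Hypothetical docker-compose sketch -- names, port, and env var are
# illustrative, not the project's actual configuration.
services:
  local-memory-mcp:
    build: .
    ports:
      - "8080:8080"        # HTTP SSE transport for remote clients
    volumes:
      - ./data:/app/data   # keep the ChromaDB store on the host
    environment:
      - MCP_TRANSPORT=sse  # assumed variable; stdio mode needs no exposed port
```

The volume mount is the important part: the memory store has to outlive the container for "persistent memory across sessions" to hold.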

Tobey is framing v1 as an early community release aimed at technically capable users willing to manage their own infrastructure. He's actively seeking feedback on the AIX philosophy, the reconciliation heuristics, and the tradeoffs between local embeddings and cloud API alternatives. As MCP gains traction across the AI assistant ecosystem, Local Memory MCP sits within a growing cluster of local-first tooling built around a straightforward question: who actually owns the context that makes these assistants useful?