
Memory

Official

by Anthropic

The Memory MCP Server is an official Model Context Protocol reference server maintained by Anthropic that gives AI assistants persistent memory across conversations using a local knowledge graph. Without persistent memory, every conversation with an AI assistant starts from zero -- the model has no recollection of prior interactions, user preferences, or established context. The Memory server solves this by maintaining a structured knowledge graph, stored as a local JSONL file, that the assistant can read from and write to during any session.

The knowledge graph is built on three core primitives. Entities are the primary nodes, representing people, organizations, projects, concepts, or any other named object; each entity has a unique name, a type classification, and a collection of observations. Relations are directed, labeled connections between entities, stored in active voice (for example, "John_Smith works_at Anthropic"). Observations are discrete, atomic pieces of information attached to entities -- individual facts that can be independently added or removed without affecting other stored knowledge.

The server exposes nine MCP tools that provide full CRUD operations over the knowledge graph: create_entities and create_relations for building the graph; add_observations for appending new facts to existing entities; delete_entities, delete_relations, and delete_observations for pruning outdated information; read_graph for retrieving the complete graph structure; search_nodes for querying entities by name, type, or observation content; and open_nodes for retrieving specific entities with their full context and interconnections.

Storage uses a simple JSONL file format, making the knowledge base human-readable, version-controllable, and trivially portable. The storage path defaults to memory.jsonl in the working directory but can be customized via the MEMORY_FILE_PATH environment variable.

Installation is straightforward through npx or Docker, and the server integrates natively with Claude Desktop, Claude Code, VS Code, Cursor, and other MCP-compatible clients. With over 44,000 weekly npm downloads and backing from the 80,000-star modelcontextprotocol/servers repository, the Memory server has become one of the most widely adopted MCP servers in the ecosystem.
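The primitives described above map naturally onto a one-record-per-line JSONL file. The sketch below is a minimal, hypothetical illustration of that layout and of a search_nodes-style lookup over it; the specific field names (`type`, `entityType`, `relationType`) are assumptions for illustration, not the server's guaranteed on-disk schema.

```python
import json

# Hypothetical records mirroring the Memory server's JSONL store: one JSON
# object per line, tagged as either an entity or a relation. Field names
# here are an assumption based on the format described above.
SAMPLE_JSONL = "\n".join([
    json.dumps({"type": "entity", "name": "John_Smith", "entityType": "person",
                "observations": ["Speaks fluent Spanish", "Prefers async communication"]}),
    json.dumps({"type": "entity", "name": "Anthropic", "entityType": "organization",
                "observations": ["AI research company"]}),
    json.dumps({"type": "relation", "from": "John_Smith", "to": "Anthropic",
                "relationType": "works_at"}),
])

def load_graph(text):
    """Parse JSONL lines into separate entity and relation lists."""
    entities, relations = [], []
    for line in text.splitlines():
        record = json.loads(line)
        (entities if record["type"] == "entity" else relations).append(record)
    return entities, relations

def search_nodes(entities, query):
    """Naive substring search over name, type, and observation text,
    in the spirit of the search_nodes tool."""
    q = query.lower()
    return [e for e in entities
            if q in e["name"].lower()
            or q in e["entityType"].lower()
            or any(q in o.lower() for o in e["observations"])]

entities, relations = load_graph(SAMPLE_JSONL)
print([e["name"] for e in search_nodes(entities, "spanish")])  # ['John_Smith']
```

Because each line is an independent record, appending an observation or relation is a single-line append to the file, which is what makes the store easy to diff and version with Git.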

productivity, knowledge-graph, persistent-memory, mcp, ai-memory, model-context-protocol, entities, relations, claude-desktop, official

Installation

npx @modelcontextprotocol/server-memory
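For Claude Desktop, the server is typically registered in the client's MCP configuration file (claude_desktop_config.json). The snippet below is a representative sketch: the "memory" server name and the storage path are placeholders to adapt, and the MEMORY_FILE_PATH entry is only needed when overriding the default memory.jsonl location.

```json
{
  "mcpServers": {
    "memory": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-memory"],
      "env": {
        "MEMORY_FILE_PATH": "/path/to/custom/memory.jsonl"
      }
    }
  }
}
```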

Key Features

  • Knowledge graph architecture with three core primitives -- entities, relations, and observations -- for structured, queryable memory storage
  • Nine MCP tools providing full CRUD operations: create_entities, create_relations, add_observations, delete_entities, delete_relations, delete_observations, read_graph, search_nodes, and open_nodes
  • Persistent JSONL file storage that is human-readable, version-controllable with Git, and portable across machines and environments
  • Configurable storage path via the MEMORY_FILE_PATH environment variable, allowing multiple isolated memory files for different projects or contexts
  • Flexible installation through npx for zero-config local usage or Docker with persistent volume mounts for containerized deployments
  • Native integration with Claude Desktop, Claude Code, VS Code, Cursor, and all MCP-compatible AI clients without additional middleware

Use Cases

  • Enabling Claude to remember user preferences, communication style, project details, and personal context across separate conversation sessions
  • Building a persistent project knowledge base where AI assistants track team members, architecture decisions, codebase conventions, and ongoing tasks
  • Creating a personal CRM where the AI remembers contacts, relationships between people, meeting notes, and follow-up commitments
  • Maintaining learning context so an AI tutor remembers what topics a student has covered, areas of difficulty, and preferred explanation styles
  • Tracking research progress by storing entities for papers, authors, key findings, and conceptual relationships between ideas across multiple research sessions

Server Stats

GitHub Stars
81,024
Updated
3/14/2026
NPM Package
@modelcontextprotocol/server-memory