Context7
by Upstash
Context7 is the most popular MCP server for injecting up-to-date, version-specific documentation and code examples directly into AI prompts. Built by Upstash, it addresses a critical problem: LLMs generating broken code from outdated training data. It does so by pulling the latest official documentation from more than 33,000 indexed libraries at the moment of prompting.

Under the hood, Context7 uses an Upstash Vector database with the DiskANN algorithm to perform semantic search across library documentation, returning relevant code snippets and API references filtered by topic. It exposes two MCP tools: resolve-library-id, which converts human-readable library names into Context7-compatible identifiers using LLM-powered search and ranking, and get-library-docs, which retrieves current documentation sections and code examples for a specific library and version.

With over 44,000 GitHub stars and 240,000 weekly npm downloads, Context7 is the most widely adopted MCP server. Integration is straightforward across all major AI code editors, including Cursor, Claude Desktop, Windsurf, VS Code, and Claude Code. Users simply add 'use context7' to a prompt to activate real-time documentation injection, eliminating hallucinated APIs and outdated code generation.
Installation
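The listing does not include installation steps; the fragment below is a minimal sketch of how an MCP server of this kind is typically registered in a stdio-based client's configuration (for example, Claude Desktop's `claude_desktop_config.json` or Cursor's `mcp.json`). The package name `@upstash/context7-mcp` is an assumption and should be verified against the project's official README.

```json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp"]
    }
  }
}
```

With this in place, the client launches the server over stdio via npx at startup; clients that prefer a hosted endpoint can connect over HTTP transport instead.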
Key Features
- Injects up-to-date, version-specific documentation and code examples directly into LLM context windows from more than 33,000 indexed libraries
- Two-step MCP tool workflow: resolve-library-id converts library names to Context7 IDs, then get-library-docs retrieves relevant documentation filtered by topic
- Semantic search powered by Upstash Vector Database with the DiskANN algorithm for cost-effective, accurate documentation retrieval across all indexed libraries
- Universal AI editor support, including Cursor, Claude Desktop, Claude Code, Windsurf, VS Code, and any MCP-compatible client via stdio or HTTP SSE transport
- Simple activation by adding 'use context7' to any prompt, with optional custom rules for automatic invocation on all code-related questions
- Free tier with API key authentication for higher rate limits, plus OAuth 2.0 support for MCP clients that implement the OAuth specification
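The two-step tool workflow above can be sketched as plain JSON-RPC 2.0 `tools/call` requests, which is how MCP clients invoke server tools. This is an illustrative mock with no live server; the argument names (`libraryName`, `context7CompatibleLibraryID`, `topic`) and the example identifier `/vercel/next.js` are assumptions that should be checked against the server's actual tool schemas.

```python
import json

def make_tool_call(name, arguments, request_id):
    """Build an MCP tools/call request in its JSON-RPC 2.0 envelope."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }

# Step 1: resolve a human-readable library name to a Context7 ID.
# ("libraryName" is an assumed argument name.)
resolve_req = make_tool_call(
    "resolve-library-id", {"libraryName": "next.js"}, request_id=1
)

# A real server would return a ranked list of candidates; we stub one.
resolved_id = "/vercel/next.js"  # hypothetical Context7-compatible ID

# Step 2: fetch current docs for that ID, filtered by topic.
# ("context7CompatibleLibraryID" and "topic" are assumed argument names.)
docs_req = make_tool_call(
    "get-library-docs",
    {"context7CompatibleLibraryID": resolved_id, "topic": "routing"},
    request_id=2,
)

print(json.dumps(docs_req, indent=2))
```

The client sends the first request, picks an identifier from the response, and only then issues the second request; the retrieved documentation is what gets injected into the LLM's context window.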
Use Cases
- Eliminating hallucinated APIs and outdated code when working with fast-moving frameworks such as Next.js, React, Tailwind CSS, and hundreds of other libraries
- Getting version-specific documentation for legacy projects that cannot immediately upgrade to the latest framework versions
- Accelerating development by injecting current code examples and API references directly into the AI assistant's context, with no manual documentation lookups
- Ensuring AI-generated code uses correct, existing APIs by grounding LLM responses in official, up-to-date library documentation
- Building documentation-aware AI assistants and coding agents that always reference current library APIs and patterns