Context7: Up-to-date code documentation for LLMs
MCP server delivering version-specific library documentation to LLMs.
Context7 is an MCP (Model Context Protocol) server that provides version-specific library documentation to Large Language Models during code generation workflows. It automatically detects dependencies listed in a project's package.json file, retrieves documentation corresponding to the exact installed versions, and injects this contextual information into the LLM's working context. This eliminates the common problem of AI models generating code based on outdated or incorrect API signatures by ensuring documentation matches the actual library versions in use. The server integrates with AI-powered code editors like Cursor and Windsurf, operating as a background service that intercepts LLM requests and enriches them with current, accurate library reference material before code suggestions are generated.
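Clients load Context7 the same way as any other MCP server. As a sketch, a Cursor-style `mcp.json` entry might look like the following; the exact file location and schema vary by client, so consult your editor's MCP documentation (the `npx` invocation of the `@upstash/context7-mcp` package mirrors the import shown later on this page):

```json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp"]
    }
  }
}
```

Once registered, the editor starts the server in the background and routes documentation lookups through it automatically.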

Version-specific documentation
Retrieves documentation tied to specific library versions rather than generic information, reducing mismatches between generated code and actual API availability.
MCP protocol integration
Operates as an MCP server compatible with multiple clients including Cursor, VS Code, Claude Code, and Windsurf, allowing seamless integration into existing development workflows.
Direct prompt injection
Places fetched documentation directly into the LLM context window without requiring manual tab-switching or copy-pasting, keeping documentation lookup within the coding interface.
import { Context7Client } from '@upstash/context7-mcp';

const client = new Context7Client({
  apiKey: process.env.CONTEXT7_API_KEY
});

// Get docs for a specific library version
const docs = await client.getLibraryDocs('react', '18.2.0');
console.log(docs.content);

Adds skills suggest command that scans project dependencies and recommends relevant skills
- Add skills suggest command that scans your project's dependencies (package.json, requirements.txt, pyproject.toml) and recommends relevant skills
- Results show install counts, trust scores, and which dependency each skill matches
Improves skill search with install counts, trust scores, and better user experience
- Add "Installs" and "Trust(0-10)" columns to skill search results with aligned column headers
- Auto-login via OAuth when the generate command requires authentication instead of showing an error
- Reorder question options so the recommended choice always appears first with a "✓ Recommended" badge
- Add "View skill" action that opens generated content in the user's default editor
Shows exact install counts and adds CLI telemetry for usage metrics
- Show exact install counts instead of rounded values, sort skills by install count in the install command
- Add CLI telemetry for usage metrics collection (commands, searches, installs, generation feedback) via fire-and-forget events
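"Fire-and-forget" here means the event is sent without awaiting the response, so telemetry can never slow down or fail a command. A minimal sketch, assuming a hypothetical endpoint and payload shape (not the CLI's real telemetry API):

```javascript
// Fire-and-forget telemetry: start the request and swallow any failure,
// so a dead endpoint or offline machine never blocks or crashes the CLI.
function trackEvent(name, properties = {}) {
  const payload = JSON.stringify({ name, properties, ts: Date.now() });
  fetch('https://telemetry.example.com/events', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: payload,
  }).catch(() => {});
  // No await, no return value: the caller proceeds immediately.
}
```

The trade-off is that delivery is best-effort, which is acceptable for usage metrics but not for anything the user depends on.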
Related Repositories
Discover similar tools and frameworks used by developers
Unsloth
Memory-efficient Python library for accelerated LLM training.
DeepSpeed
PyTorch library for training billion-parameter models efficiently.
OpenAI Python
Type-safe Python client for OpenAI's REST API.
MLX
Lazy-evaluated NumPy-like arrays optimized for Apple silicon.
llama.cpp
Quantized LLM inference with hardware-accelerated CPU/GPU backends.