The top MCP servers to install in 2026
We surveyed the ecosystem: Supabase, Context7, PostHog, Playwright, GitHub, Linear, Sentry, and more. Here's what each one does, when you'd install it, and the one-click config for MCPBolt.
The MCP ecosystem is large enough now that “which servers should I install?” is a real question with a non-obvious answer. Registries like Glama index over 21,000 servers; most of them are experiments, wrappers, or duplicates of official implementations. The signal-to-noise ratio is poor.
This list is curated by category. Each entry is a server with a genuine production use case, maintained by an organization that has reason to keep it working, and available via a stable install path. Where multiple implementations exist, we list the official one.
Run npx mcpbolt, paste the config, and select your target tools. MCPBolt handles the format translation for every tool you use.

Dev tools
GitHub
GitHub's official MCP server (released April 2025, rewritten in Go) gives your AI assistant access to repositories, issues, pull requests, code search, GitHub Actions workflow runs, and Dependabot alerts. It is arguably the most useful single server you can install: most coding projects live in a GitHub repo, and being able to ask your AI to “open a PR for this branch against main” or “find all open issues tagged bug” without leaving your editor is genuinely time-saving.
GitHub offers both a local server and a remote hosted option at https://api.githubcopilot.com/mcp/ that requires no local installation. For the local version:
{
  "mcpServers": {
    "github": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "-e", "GITHUB_PERSONAL_ACCESS_TOKEN", "ghcr.io/github/github-mcp-server"],
      "env": { "GITHUB_PERSONAL_ACCESS_TOKEN": "YOUR_TOKEN" }
    }
  }
}

Sentry
Sentry's official MCP server (@sentry/mcp-server) gives your AI direct access to your Sentry projects: issues, error traces, Seer AI analysis, and project configuration. The most useful workflow: paste an error message into Claude or Cursor, have the AI pull the related Sentry issue for full context, then fix the bug. A loop that used to involve three browser tabs and copy-pasted stack traces becomes one command.
{
  "mcpServers": {
    "sentry": {
      "command": "npx",
      "args": ["-y", "@sentry/mcp-server"],
      "env": { "SENTRY_ACCESS_TOKEN": "YOUR_TOKEN" }
    }
  }
}

Linear
Linear's MCP server is available as a remote endpoint at https://mcp.linear.app/sse. It exposes your issues, projects, cycles, and teams. Useful for any workflow where you want your AI to create tasks from code review notes, triage incoming bugs into Linear, or check what's in the current sprint before suggesting what to work on next.
{
  "mcpServers": {
    "linear": {
      "url": "https://mcp.linear.app/sse",
      "headers": { "Authorization": "Bearer YOUR_TOKEN" }
    }
  }
}

Data and databases
Supabase
The Supabase MCP server is the canonical example of what a database MCP server looks like done right. It exposes your Supabase project's tables, lets your AI run queries, manage schema migrations, and inspect data. If you're building anything with Supabase as your backend, this is a near-mandatory install. The AI can help write SQL, debug queries against real schema, and propose migrations without you having to paste table definitions into the context window by hand.
{
  "mcpServers": {
    "supabase": {
      "url": "https://mcp.supabase.com/mcp",
      "headers": { "Authorization": "Bearer YOUR_SUPABASE_ACCESS_TOKEN" }
    }
  }
}

PostHog
PostHog's MCP server (@posthog/mcp-server, also available at https://mcp.posthog.com/mcp) exposes your product analytics to your AI assistant. You can ask it to pull funnel data, query events, manage feature flags, and set up A/B experiments. The practical use case: you're implementing a feature and want to immediately instrument it correctly. Instead of jumping between your IDE and the PostHog dashboard, the AI can check what events already exist, propose the right event names, and verify the implementation against your existing schema.
{
  "mcpServers": {
    "posthog": {
      "url": "https://mcp.posthog.com/mcp",
      "headers": { "Authorization": "Bearer YOUR_POSTHOG_TOKEN" }
    }
  }
}

Filesystem
The official Anthropic filesystem server from @modelcontextprotocol/server-filesystem gives your AI read and write access to a specified directory. It's the simplest server on this list and one of the most useful: it lets Claude Desktop (which does not have built-in filesystem access by default) actually read and write files. You specify which directories to expose at launch, so it's safe to give access to your projects folder without worrying about the AI touching system files.
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/you/projects"]
    }
  }
}

Web and automation
Playwright
The Playwright MCP server (@playwright/mcp) is the most capable browser automation server available. Unlike screenshot-based browser tools, Playwright MCP operates through the accessibility tree: it reads page structure as structured data, clicks elements by accessible name, fills forms, and navigates. This makes it dramatically more reliable than pixel-clicking approaches, and it works without a vision model. Install this when you want your AI to interact with web applications: scraping, testing, form automation, UI verification.
{
  "mcpServers": {
    "playwright": {
      "command": "npx",
      "args": ["-y", "@playwright/mcp"]
    }
  }
}

Fetch
The official Fetch server from @modelcontextprotocol/server-fetch lets your AI retrieve arbitrary web content and convert it to clean markdown. Less powerful than Playwright (no interaction, just retrieval) but much lighter weight. Useful when you want the AI to pull current documentation from a URL without needing a full browser instance. Think of it as a programmable curl | lynx available inside your AI session.
{
  "mcpServers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  }
}

Knowledge and productivity
Context7
Context7, built by Upstash (@upstash/context7-mcp), solves one of the most frustrating AI coding failure modes: the AI confidently writes code against a library API that changed six months ago. Context7 pulls the current, version-specific documentation for any library and injects it into the context before the AI generates code. No more hallucinated APIs, no more deprecated patterns. You add use context7 to your prompt and it fetches the right docs automatically.
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp"]
    }
  }
}

Memory
The official Memory server from @modelcontextprotocol/server-memory implements a knowledge graph that persists across sessions. Your AI can store facts, link entities, and retrieve them in future conversations. This is the basic infrastructure for making an AI assistant that remembers things across sessions: your preferences, project context, decisions you've made, people you've mentioned. Not magic, but the foundation for building something close to it.
{
  "mcpServers": {
    "memory": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-memory"]
    }
  }
}

Brave Search
The Brave Search server gives your AI real web search access via the Brave Search API. Unlike retrieval-augmented generation approaches that embed your own documents, this lets the AI search the open web. Useful for anything that requires current information: checking whether a package has a breaking change, looking up a Stack Overflow answer, finding the current API endpoint for a service. Requires a Brave Search API key (free tier available).
{
  "mcpServers": {
    "brave-search": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-brave-search"],
      "env": { "BRAVE_API_KEY": "YOUR_KEY" }
    }
  }
}

To install any of these with MCPBolt, copy the config snippet, run npx mcpbolt, and paste. MCPBolt writes the correct format for every AI tool you select. See the quickstart for a step-by-step walkthrough.