Which AI Tools Actually Support MCP Well Right Now (May 2026)
Every Model Context Protocol server on the internet is, at the end of the day, a URL. The hard question is which AI tool you're going to plug it into — and the honest answer is that the experience varies wildly depending on which app you live in.
I run MCPBundles, so I see what users actually do after they generate an MCP URL. A lot of them sign up, get the URL, then bounce because the next step — wiring it into the tool they actually use — is unfamiliar territory. Sometimes that's our fault for not making it obvious. Sometimes the tool's setup flow is genuinely awkward. And sometimes the tool literally hides MCP behind a developer toggle that nobody told you to flip.
This is the field report I'd write a friend who asked me, today, "which AI tool should I use if I want MCP to actually work?" Frank, opinionated, with the quirks named.

The smooth ones
These are the AI tools where adding a remote MCP server takes under a minute, OAuth runs automatically, and you don't have to read documentation to find a hidden setting.
Cursor — A
Cursor was one of the first to ship native HTTP MCP and it shows. There's a one-click install button on every public bundle page that opens Cursor via a cursor:// deeplink with the URL pre-filled. You confirm, sign in, done. If you'd rather hand-edit, the config lives at ~/.cursor/mcp.json in the standard mcpServers block:
```json
{
  "mcpServers": {
    "your-server": {
      "url": "https://your-mcp-url.example.com/mcp"
    }
  }
}
```
Cursor handles OAuth on the URL form natively. Of every tool in this list, Cursor has the lowest activation energy.
VS Code — A
VS Code's MCP: Add Server command palette flow is the closest thing to Cursor's. There's a vscode:// deeplink, an Insiders variant, and the same straightforward config shape — just a different file (servers instead of mcpServers) and a different path:
```
~/Library/Application Support/Code/User/mcp.json   # macOS
%APPDATA%\Code\User\mcp.json                       # Windows
~/.config/Code/User/mcp.json                       # Linux
```
```json
{
  "servers": {
    "your-server": {
      "type": "http",
      "url": "https://your-mcp-url.example.com/mcp"
    }
  }
}
```
The type: "http" field is required for VS Code; the other tools infer it.
Claude Code (CLI) — A
Claude Code has the cleanest CLI ergonomics of any host. One terminal command:
```shell
claude mcp add --transport http your-server https://your-mcp-url.example.com/mcp
```
Then, inside a session, /mcp runs OAuth interactively. For headless or CI use, swap OAuth for an API key with --header "X-API-Key: ...". That's the entire mental model.
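Spelled out, the headless variant looks like this. This is a sketch of the flag described above; the X-API-Key header name and the key value are placeholders for whatever your server actually expects:

```shell
# Headless / CI: a static API key header instead of interactive OAuth.
claude mcp add --transport http your-server \
  https://your-mcp-url.example.com/mcp \
  --header "X-API-Key: YOUR_KEY"
```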
Claude Desktop — A (as of May 2026)
The story here changed materially this spring. Claude Desktop now supports remote MCP servers as Custom Connectors with native OAuth. The path is Settings → Connectors → + → Add custom connector, paste the URL, click Connect. Credentials rotate without reinstalling and the same URL works on Claude.ai web.
The older .mcpb one-click installer (a bundled archive you double-click, then paste an API key into) is still supported as a fallback for older builds. New installs should default to the URL path.
Claude.ai (web) — A, with one caveat
Same Custom Connector model as Claude Desktop, plus a claude.ai/install-mcp deeplink that prefills the dialog. OAuth handles auth.
The caveat: custom connectors require a paid Claude plan — Pro, Max, Team, or Enterprise. Free Claude.ai accounts cannot add custom MCP servers. Not a deal-breaker, but worth knowing before you tell a teammate to try one.
The quirky ones
These tools support MCP but with at least one wrinkle that costs people an evening if they don't know about it.
ChatGPT — B (the discovery problem)
ChatGPT supports remote MCP servers fine. Once it's wired up it works. The catch is finding the setting in the first place: you have to enable a hidden Developer Mode toggle.
The current path (after OpenAI renamed "Connectors" to "Apps" in December 2025):
- Settings → Apps & Connectors → Advanced settings
- Enable Developer mode
- Back in Apps & Connectors, click Create (the button only appears once Developer Mode is on)
- Paste your MCP URL, choose OAuth
Available in beta on Plus, Pro, Business, Enterprise, and Education plans on the web. Not available on free ChatGPT.
If you don't know Developer Mode exists, you'll spend ten minutes hunting for the "Add MCP server" button that isn't there. The functionality is solid; the discoverability is the problem.
Codex — was B, just became A
For most of 2025, Codex remote MCP support required setting experimental_use_rmcp_client = true in ~/.codex/config.toml before remote HTTP servers worked. Every blog post from that period (including, until last week, our own setup guide) tells you to add that flag.
That flag was completely removed from Codex on December 20, 2025 in PR #8087. The Rust MCP client is now the default. If you're seeing docs that still mention experimental_use_rmcp_client or [features].rmcp_client, they're stale.
The current minimal Codex MCP config is just:
```toml
[mcp_servers.your_server]
url = "https://your-mcp-url.example.com/mcp"
```
Then either run codex mcp login your_server for OAuth or pass http_headers = { "X-API-Key" = "..." } for headless. The CLI command codex mcp add your-server --url <url> does the same thing for you.
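Putting the two pieces together, a complete headless config file might look like the following. This is a sketch of the http_headers option mentioned above; the header name is an assumption and depends on your server:

```toml
[mcp_servers.your_server]
url = "https://your-mcp-url.example.com/mcp"
# Headless alternative to `codex mcp login`; header name is server-specific.
http_headers = { "X-API-Key" = "YOUR_KEY" }
```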
Gemini CLI / Gemini Code — was B, also just became A
Same story. Until late 2025, Gemini's MCP config split between an httpUrl field for HTTP transport and a url field for SSE — and remote servers often needed an npx mcp-remote subprocess wrapper. PR #13762 (merged early December 2025) consolidated remote config to a single url field and added auto-detection of HTTP vs SSE transport. The npx mcp-remote shim is no longer required.
Current ~/.gemini/settings.json:
```json
{
  "mcpServers": {
    "your-server": {
      "url": "https://your-mcp-url.example.com/mcp"
    }
  }
}
```
Add "type": "http" if you want to skip the auto-detection probe. OAuth is supported.
Windsurf — B
Works fine, but Windsurf's wire format is the odd one out. Where everyone else uses url, Windsurf uses serverUrl, and you have to add type: "streamable-http" explicitly:
```json
{
  "mcpServers": {
    "your-server": {
      "type": "streamable-http",
      "serverUrl": "https://your-mcp-url.example.com/mcp",
      "headers": { "Authorization": "Bearer YOUR_KEY" }
    }
  }
}
```
Config lives at ~/.codeium/windsurf/mcp_config.json. Once you've got the shape right it's stable. The friction is purely "I copy-pasted the wrong key name from someone else's config."
Zed — B (header limitation)
Zed supports remote MCP via context_servers in ~/.config/zed/settings.json. The wrinkle is structural: Zed's URL form does not accept HTTP headers. That means anything requiring an Authorization: Bearer ... header — which is most production MCP servers — has to use the command form with npx mcp-remote as a header-injecting shim:
```json
{
  "context_servers": {
    "your-server": {
      "command": "npx",
      "args": [
        "-y", "mcp-remote",
        "https://your-mcp-url.example.com/mcp",
        "--header", "Authorization: Bearer YOUR_KEY"
      ]
    }
  }
}
```
Annoying, but a real Zed limitation rather than a documentation gap. If your MCP server uses pure OAuth without static headers, you can skip the shim.
LM Studio (v0.3.17+), Cherry Studio, Goose — B
All three support mcpServers JSON config plus a custom protocol deeplink (lmstudio://, cherrystudio://, goose://) for one-click installation from a web page. Solid implementations, no major surprises. The only reason these aren't in the A tier is that the wider ecosystem hasn't standardised behind them the way it has behind Cursor, VS Code, and Claude.
Raycast — B
HTTP MCP supported. Setup is form-based inside the app: Name + Transport: HTTP + URL. Quietly competent.
The ones to watch
v0 by Vercel, Devin, Factory droid — newer, fine
All three shipped MCP support in late 2025 / early 2026 with OAuth, using custom integration / connector flows that mirror the patterns above. No specific rough edges I've hit; just less time in the wild than Cursor or Claude.
Patterns worth knowing
After watching every host above evolve over the past year, four patterns are stable:
1. Every MCP host reads a JSON file. The shape varies, the idea doesn't.
The container key differs:
- mcpServers — most hosts (Cursor, Claude Desktop, Windsurf, LM Studio, Cherry Studio, Goose, Gemini)
- servers — VS Code
- context_servers — Zed
The endpoint key differs:
- url — most hosts (after the recent Gemini and Codex consolidations)
- serverUrl — Windsurf
- command + args — Zed (when headers are required), and as an npx mcp-remote fallback elsewhere
But every config is fundamentally {name → endpoint, optional headers}. Once you've configured one host, you can configure all of them.
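That claim is concrete enough to sketch in code. Here's a minimal illustration of the idea: one canonical (name, url, headers) spec rendered into three of the wire formats shown earlier. The field names come from the configs above; the helper functions themselves are invented for illustration, not part of any host's tooling:

```python
import json

def to_cursor(name, url, headers=None):
    # Cursor-style shape, also used by Claude Desktop, LM Studio, Gemini:
    # "mcpServers" container with a "url" field.
    server = {"url": url}
    if headers:
        server["headers"] = headers
    return {"mcpServers": {name: server}}

def to_vscode(name, url, headers=None):
    # VS Code: "servers" container, explicit "type": "http".
    server = {"type": "http", "url": url}
    if headers:
        server["headers"] = headers
    return {"servers": {name: server}}

def to_windsurf(name, url, headers=None):
    # Windsurf: "serverUrl" instead of "url", explicit streamable-http type.
    server = {"type": "streamable-http", "serverUrl": url}
    if headers:
        server["headers"] = headers
    return {"mcpServers": {name: server}}

spec = ("your-server", "https://your-mcp-url.example.com/mcp",
        {"Authorization": "Bearer YOUR_KEY"})

for render in (to_cursor, to_vscode, to_windsurf):
    print(json.dumps(render(*spec), indent=2))
```

Three functions, one spec, and the only differences are key names. That's the whole story of host-to-host config portability.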
2. OAuth is now the default; static API keys are the legacy path.
A year ago, plugging an MCP server into an AI tool usually meant pasting an API key into a config file. As of mid-2026, every host except Zed routes through OAuth on first connection and rotates credentials silently. If a host is still asking you to paste a static token, that host is behind.
3. Free tiers rarely include MCP. That's not an accident.
ChatGPT free, Claude.ai free, and Cursor free cannot add custom MCP servers. This isn't going to change soon — vendors are gating MCP support behind paid plans because remote tool execution has real serving cost (every MCP call is at minimum an inbound webhook, a tool call, an outbound API request, and several token-counted round trips through the model). MCP support is becoming a paid-tier feature the same way long context windows did.
4. Experimental flags get yanked the moment they ship for real.
Codex needed experimental_use_rmcp_client = true for most of 2025. That flag is gone. ChatGPT's Developer Mode toggle has the same shape — it's beta today, almost certainly the default tomorrow. The lesson for anyone writing integration documentation: do not anchor your instructions on a feature toggle, because the toggle will be removed before your blog post is six months old. Document the steady state and add a note about the temporary flag.
(This is, in fact, why this post exists. We had experimental_use_rmcp_client = true and npx mcp-remote in our own setup guide as recently as last week. Both were stale. I'm fixing them in the same commit as I publish this.)
So which tool should you use?
If you live in Cursor, VS Code, Claude Desktop, or Claude Code, you're in the easy lane. Pick one, paste the URL, sign in, you're done.
If you live in ChatGPT or Codex, the tooling is solid; the discoverability is not. Bookmark the Developer Mode path; you'll need it again.
If you live in Zed, accept that you'll be using npx mcp-remote as a header-injecting shim until Zed's URL form supports headers.
If you're somewhere else — Windsurf, LM Studio, Cherry Studio, Goose, Raycast, Gemini, v0, Devin, Factory — it'll work. Look at the bundle's setup tab on mcpbundles.com for the exact copy-paste config, because the JSON shapes still differ host to host.
What we ship
MCPBundles generates the same MCP URL once and shows you the exact configuration for every host above on every bundle page. Eighteen tabs, all kept current with whatever the upstream tool actually supports today. If you spot a tab where we're behind reality — like npx mcp-remote for Gemini, or the Codex experimental flag — flag it; we'd rather know than ship stale docs.
The bigger picture: the MCP ecosystem stopped looking like 2024's chaos of "every tool has its own protocol and no two tools agree" sometime in late 2025. The remaining differences between hosts are now mostly cosmetic — different field names, different config file paths, different one-click deeplinks. The core idea (URL + OAuth + JSON) has become boringly universal. That's a good thing. It means the next year of AI tooling is going to be about what you do with the tools, not how you wire them in.
— Tony