
Snowflake MCP Server: Query, Explore & Manage Your Data Warehouse with AI

MCPBundles · 6 min read


Snowflake doesn't have an official MCP server. For a platform where most users interact through SQL, that's a meaningful gap — every question an analyst has starts with "what tables do we have?" and ends with a query, and that's exactly the workflow MCP is built for.

MCPBundles provides a purpose-built Snowflake toolset that connects your AI agent directly to your Snowflake account through the SQL REST API. Your AI can explore databases, navigate schemas, inspect table structures, execute arbitrary SQL, manage warehouses, and review query history — all authenticated with a Programmatic Access Token.

What Your AI Can Do

Explore Your Data Warehouse

| Tool | What it does |
| --- | --- |
| List databases | All databases accessible to your role |
| List schemas | Schemas within any database — navigate your data hierarchy |
| List tables | Tables in any schema with row counts and size on disk |
| Describe table | Column names, types, nullability, defaults, primary/unique keys |

Your AI can orient itself in an unfamiliar Snowflake account in seconds: list databases, drill into schemas, find the tables, and describe their structure — all before writing a single query.
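That orientation pass maps onto plain Snowflake commands. A sketch (database, schema, and table names here are illustrative):

```sql
-- Step 1: what databases can this role see?
SHOW DATABASES;

-- Step 2: drill into one and find its tables
SHOW SCHEMAS IN DATABASE ANALYTICS;
SHOW TABLES IN SCHEMA ANALYTICS.PUBLIC;

-- Step 3: inspect a table's structure before writing any query
DESCRIBE TABLE ANALYTICS.PUBLIC.ORDERS;
```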

Execute SQL

| Tool | What it does |
| --- | --- |
| Execute SQL | Run any SQL statement — SELECT, INSERT, CREATE, SHOW, CALL, anything |

This isn't limited to SELECT. Your AI can run DDL (CREATE TABLE, ALTER), DML (INSERT, UPDATE, DELETE), Snowflake-specific commands (SHOW, DESCRIBE, CALL), and analytics queries. Results come back as structured rows with column metadata.

Override the warehouse, database, schema, or role per query — useful when your AI needs to run something against a different context without changing the session defaults.
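Under the hood, the SQL REST API accepts these overrides as top-level fields on the statement request, so nothing about the session defaults changes. A sketch of the request body (field names follow Snowflake's SQL API v2; the values are illustrative):

```json
{
  "statement": "SELECT COUNT(*) FROM ANALYTICS.PUBLIC.ORDERS",
  "warehouse": "REPORTING_WH",
  "database": "ANALYTICS",
  "schema": "PUBLIC",
  "role": "ANALYST",
  "timeout": 60
}
```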

Manage & Monitor

| Tool | What it does |
| --- | --- |
| List warehouses | All virtual warehouses with state, size, auto-suspend, and auto-resume settings |
| Query history | Recent queries with execution time, status, rows produced, bytes scanned |

Query history is particularly useful for debugging. Your AI can pull the last 100 queries, filter by warehouse, and identify the slow ones — with compilation time, execution time, and error messages all visible.
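The equivalent raw query uses Snowflake's `INFORMATION_SCHEMA.QUERY_HISTORY` table function. A sketch, slowest queries first (`REPORTING_WH` is a hypothetical warehouse name):

```sql
-- Last 100 queries on one warehouse, ranked by execution time
SELECT query_id,
       query_text,
       execution_status,
       compilation_time,
       execution_time,
       error_message
FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY(RESULT_LIMIT => 100))
WHERE warehouse_name = 'REPORTING_WH'
ORDER BY execution_time DESC;
```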

Real Workflows

"What's in this Snowflake account?"

Your AI lists all databases, picks the relevant one, shows its schemas, and describes the key tables. Full orientation without opening the Snowflake console.

"How many orders did we process last week?"

SQL execution against your warehouse. Your AI writes the query, runs it, and returns the result — with the option to break it down by day, region, or product category if you ask follow-up questions.
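The generated query might look like this (`ANALYTICS.PUBLIC.ORDERS` and `order_date` are hypothetical names):

```sql
-- Orders placed during the previous calendar week
SELECT COUNT(*) AS orders_last_week
FROM ANALYTICS.PUBLIC.ORDERS
WHERE order_date >= DATE_TRUNC('week', CURRENT_DATE) - INTERVAL '7 days'
  AND order_date <  DATE_TRUNC('week', CURRENT_DATE);
```

A follow-up like "break it down by day" just swaps in a `GROUP BY DATE_TRUNC('day', order_date)`.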

"Which queries are running slow?"

Query history filtered by warehouse, sorted by execution time. Your AI identifies the heaviest queries and can suggest optimizations by examining the SQL and table structures.

"Create a staging table for the new data feed"

DDL execution — CREATE TABLE with the right column types, constraints, and clustering keys. Your AI can describe existing tables first to match the schema conventions already in use.
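A sketch of what that DDL might look like — the table, columns, and clustering key are hypothetical, chosen to match whatever conventions `DESCRIBE TABLE` surfaced:

```sql
-- Staging table for an incoming feed; VARIANT holds the raw payload
CREATE TABLE IF NOT EXISTS RAW.STAGING.FEED_EVENTS (
    event_id     VARCHAR NOT NULL,
    received_at  TIMESTAMP_NTZ NOT NULL,
    source       VARCHAR,
    payload      VARIANT,
    CONSTRAINT pk_feed_events PRIMARY KEY (event_id)
)
CLUSTER BY (received_at);
```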

"Compare this month's revenue to last month"

Time travel queries (SELECT ... AT(TIMESTAMP => ...)) or standard date-filtered aggregations. Your AI handles the SQL, and you get the answer in natural language.
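The date-filtered version can be a single pass over the table (`amount` and the table name are hypothetical):

```sql
-- This month vs. last month as two conditional aggregates
SELECT
    SUM(CASE WHEN order_date >= DATE_TRUNC('month', CURRENT_DATE)
             THEN amount END) AS this_month,
    SUM(CASE WHEN order_date >= ADD_MONTHS(DATE_TRUNC('month', CURRENT_DATE), -1)
              AND order_date <  DATE_TRUNC('month', CURRENT_DATE)
             THEN amount END) AS last_month
FROM ANALYTICS.PUBLIC.ORDERS;
```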

How Authentication Works

Snowflake tools use Programmatic Access Tokens (PATs) — a Snowflake-native auth mechanism that doesn't require a full OAuth flow or key-pair setup:

  1. Generate a PAT in your Snowflake account (ALTER USER ADD PROGRAMMATIC ACCESS TOKEN)
  2. Enter the token and your account URL on MCPBundles
  3. Optionally set a default warehouse, database, and role

PATs are scoped to the user's role permissions, so your AI can only access what that user can access. Token lifetime is configurable from 1 to 90 days.
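Step 1 spelled out (user name, token name, role, and expiry are all illustrative; Snowflake returns the token secret once, in the statement's result):

```sql
-- Generate a PAT restricted to one role, expiring in 30 days
ALTER USER my_user ADD PROGRAMMATIC ACCESS TOKEN mcp_token
    ROLE_RESTRICTION = 'ANALYST'
    DAYS_TO_EXPIRY = 30;
```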

Snowflake-Specific Gotchas Your AI Knows

  • Identifiers are uppercase by default — unquoted identifiers are stored as uppercase in Snowflake. The tools handle quoting automatically.
  • Warehouse must be running — queries require an active warehouse. If it's suspended, Snowflake auto-resumes it (adds a few seconds of latency).
  • Semi-structured data — VARIANT, OBJECT, and ARRAY types work with : and [] path notation. Your AI can query JSON-like data natively.
  • Time travel — historical queries with AT(TIMESTAMP => ...) or BEFORE(STATEMENT => ...) for point-in-time analysis.
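The semi-structured path notation in practice (the table and field names are hypothetical): `:` walks object keys, `[]` indexes arrays, and `::` casts the extracted value.

```sql
-- Pull typed fields out of a VARIANT column
SELECT payload:customer.name::STRING AS customer_name,
       payload:items[0].sku::STRING  AS first_sku
FROM RAW.STAGING.FEED_EVENTS
LIMIT 10;
```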

Setup

  1. Enable the Snowflake bundle on MCPBundles
  2. Add your account URL and Programmatic Access Token
  3. Ask your AI: "List all databases and show me what tables are in each one"

FAQ

Is there an official Snowflake MCP server?

No. Snowflake does not publish an official MCP server. MCPBundles connects to Snowflake through the SQL REST API using Programmatic Access Tokens.

Can my AI modify data in Snowflake?

Yes. The SQL execution tool supports INSERT, UPDATE, DELETE, and DDL operations. Access is governed by the Snowflake role associated with your PAT — if the role can't delete, neither can your AI.

Does this work with Snowpark or Cortex AI?

The SQL execution tool can run any SQL that Snowflake supports, including CALL statements for stored procedures and Snowpark-based UDFs. Cortex AI functions (like SNOWFLAKE.CORTEX.COMPLETE()) work as regular SQL calls.
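For example, a Cortex call is just a SELECT (the model name is illustrative; availability varies by region and account):

```sql
-- Cortex invoked as an ordinary SQL function
SELECT SNOWFLAKE.CORTEX.COMPLETE(
    'mistral-large',
    'Summarize: Snowflake is a cloud data platform.'
) AS summary;
```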

What about multi-cluster warehouses and auto-scaling?

The list warehouses tool shows cluster state, size, and auto-suspend configuration. Your AI can monitor warehouse utilization and help you decide when to resize or adjust auto-suspend settings.

How is this different from connecting Snowflake to a BI tool?

BI tools give you dashboards. MCP gives your AI direct SQL access so it can answer ad-hoc questions, explore unfamiliar schemas, and run one-off analyses without building a dashboard first. They're complementary — use BI for recurring reports, MCP for the questions you think of on the fly.