
mcp-bigquery-server

by ergut · ergut/mcp-bigquery-server

Read-only natural-language BigQuery from Claude — schema exploration, query limits, PII field restrictions — with service account auth.

mcp-bigquery-server is a Node.js MCP server that gives LLMs safe, read-only access to BigQuery datasets. It enforces a configurable scanned-bytes limit per query (1 GB by default), supports field-level restrictions for PII/PHI, and can be installed via Smithery or configured manually with service-account credentials.


Install

Pick your client

~/Library/Application Support/Claude/claude_desktop_config.json  · Windows: %APPDATA%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "bigquery-server": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-bigquery-server"
      ]
    }
  }
}

Open Claude Desktop → Settings → Developer → Edit Config. Restart after saving.
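The auto-generated snippet above omits credentials. One common pattern (an assumption, not the server's documented config — check the README for its exact flags) is to point the standard GOOGLE_APPLICATION_CREDENTIALS variable at your service-account key via an env block:

```json
{
  "mcpServers": {
    "bigquery-server": {
      "command": "npx",
      "args": ["-y", "mcp-bigquery-server"],
      "env": {
        "GOOGLE_APPLICATION_CREDENTIALS": "/absolute/path/to/service-account-key.json"
      }
    }
  }
}
```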

~/.cursor/mcp.json · .cursor/mcp.json
{
  "mcpServers": {
    "bigquery-server": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-bigquery-server"
      ]
    }
  }
}

Cursor uses the same mcpServers schema as Claude Desktop. Project config wins over global.

VS Code → Cline → MCP Servers → Edit
{
  "mcpServers": {
    "bigquery-server": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-bigquery-server"
      ]
    }
  }
}

Click the MCP Servers icon in the Cline sidebar, then "Edit Configuration".

~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "bigquery-server": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-bigquery-server"
      ]
    }
  }
}

Same shape as Claude Desktop. Restart Windsurf to pick up changes.

~/.continue/config.json
{
  "mcpServers": [
    {
      "name": "bigquery-server",
      "command": "npx",
      "args": [
        "-y",
        "mcp-bigquery-server"
      ]
    }
  ]
}

Continue uses an array of server objects rather than a map.

~/.config/zed/settings.json
{
  "context_servers": {
    "bigquery-server": {
      "command": {
        "path": "npx",
        "args": [
          "-y",
          "mcp-bigquery-server"
        ]
      }
    }
  }
}

Add to context_servers. Zed hot-reloads on save.

claude mcp add bigquery-server -- npx -y mcp-bigquery-server

One-liner. Verify with claude mcp list. Remove with claude mcp remove bigquery-server.
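If the server needs credentials at startup, the one-liner can pass them as an environment variable (a sketch assuming Claude Code's -e/--env flag; the key path is a placeholder):

```shell
claude mcp add bigquery-server \
  -e GOOGLE_APPLICATION_CREDENTIALS=/absolute/path/to/key.json \
  -- npx -y mcp-bigquery-server
```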

Use Cases

Real-world ways to use mcp-bigquery-server

Answer product/growth questions from BigQuery without writing SQL

👤 PMs, growth analysts with BQ-backed warehouse · ⏱ ~15 min · intermediate

When to use: You have a question whose answer lives in events tables in BQ.

Prerequisites
  • GCP service account with BQ Data Viewer + Job User — IAM > Create service account; download JSON key
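The prerequisite setup can be sketched with gcloud (project ID, account name, and key path are placeholders; note that project-level grants cover all datasets — grant roles on specific datasets instead if you want tighter scope):

```shell
# Create a dedicated read-only service account
gcloud iam service-accounts create bq-mcp-reader --project=YOUR_PROJECT

# Grant the two read-only BigQuery roles
gcloud projects add-iam-policy-binding YOUR_PROJECT \
  --member="serviceAccount:bq-mcp-reader@YOUR_PROJECT.iam.gserviceaccount.com" \
  --role="roles/bigquery.dataViewer"
gcloud projects add-iam-policy-binding YOUR_PROJECT \
  --member="serviceAccount:bq-mcp-reader@YOUR_PROJECT.iam.gserviceaccount.com" \
  --role="roles/bigquery.jobUser"

# Download a JSON key for the MCP to use
gcloud iam service-accounts keys create key.json \
  --iam-account=bq-mcp-reader@YOUR_PROJECT.iam.gserviceaccount.com
```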
Flow
  1. Discover tables
    List tables in dataset analytics. Describe events and users.
    → Schemas
  2. Ask the question
    How many users who signed up in March 2026 triggered the 'aha_moment' event within 7 days?
    → Numeric answer with SQL shown
  3. Caveat
    Any caveats? Timezone, deletion, test users?
    → Honest caveats

Outcome: Answers in minutes instead of data-team tickets.

Pitfalls
  • Running SELECT * on a huge fact table blows the scan limit — Always filter by partition column (often _PARTITIONDATE)
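A partition-filtered version of the question above might look like this (table and column names are illustrative):

```sql
-- Scans only one month of partitions instead of the whole table
SELECT COUNT(DISTINCT user_id) AS march_signups
FROM `your-project.analytics.events`
WHERE _PARTITIONDATE BETWEEN '2026-03-01' AND '2026-03-31'
  AND event_name = 'signup'
```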

Let a less-trusted analyst explore data without reading PII rows

👤 Data platform teams · ⏱ ~30 min · advanced

When to use: You want to open BQ access to more people via chat without each of them being able to read customer emails.

Flow
  1. Configure restricted fields
    Add config.json entry restricting fields users.email, users.phone, users.ssn. Agent can only aggregate these, not SELECT them raw.
    → Config in place
  2. Test
    Run SELECT email FROM users LIMIT 10. Verify it's blocked. Then run SELECT domain, COUNT(*) FROM users GROUP BY domain — verify it works.
    → Block on raw read; allow on aggregate

Outcome: Safer self-service analytics for the LLM era.

Pitfalls
  • Regex-based field detection can miss complex aliased SQL — Defense in depth — also use BQ column-level security / authorized views
Combine with: gateway

Auto-compile a daily metrics digest from BQ

👤 PMs, founders · ⏱ ~30 min · intermediate

When to use: You want KPIs in Slack every morning without a BI tool.

Flow
  1. Define the metrics
    Define queries for: DAU, signups, revenue, top-3 errors. Each with yesterday / 7-day avg.✓
    → SQL per metric
  2. Run and format
    Run all and format as a Slack-ready digest. Include week-over-week deltas.
    → Slack-ready message

Outcome: Daily metrics without managed BI cost.

Combine with: notion
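The DAU metric from step 1 might be defined like this (the events table and its schema are assumptions — adapt to yours):

```sql
-- Trailing 7-day average of daily active users
SELECT AVG(dau) AS dau_7d_avg
FROM (
  SELECT _PARTITIONDATE AS day, COUNT(DISTINCT user_id) AS dau
  FROM `your-project.analytics.events`
  WHERE _PARTITIONDATE >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
  GROUP BY day
)
```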

Combinations

Pair with other MCPs for 10x leverage

bigquery-server + notion

Weekly KPI doc

Run my weekly KPI queries and create a Notion page in 'Metrics Weekly' with results + commentary.
bigquery-server + gateway

PII-safe access via mcp-gateway + Presidio

Put BigQuery MCP behind mcp-gateway with Presidio; verify customer emails get redacted in results.

Tools

What this MCP exposes

Tool            Inputs                      When to call                                 Cost
list_datasets   —                           First step to orient                         free
list_tables     dataset                     Navigate a dataset                           free
describe_table  dataset, table              Before querying                              free
query           sql: str, max_bytes?: int   Main read tool; 1GB scan limit by default    BQ on-demand: $6.25 per TB scanned
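An illustrative query tool call (argument names taken from the table above; the server's exact input schema may differ):

```json
{
  "name": "query",
  "arguments": {
    "sql": "SELECT status, COUNT(*) FROM `your-project.analytics.events` WHERE _PARTITIONDATE = '2026-03-01' GROUP BY status",
    "max_bytes": 1000000000
  }
}
```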

Cost & Limits

What this costs to run

API quota
BigQuery job quotas (generous)
Tokens per call
Query results can be huge — always LIMIT or aggregate
Monetary
Pay GCP by bytes scanned ($6.25/TB on-demand). Configure scan limit in MCP to cap.
Tip
Filter by partition. A full-table scan on a busy fact table = real money. The MCP's byte limit is your safety net.
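Before running anything expensive, you can estimate the scan with bq's dry-run mode, which reports bytes processed without billing anything (project and table names are placeholders):

```shell
# Validates the query and reports how many bytes it would process,
# without actually running it.
bq query --use_legacy_sql=false --dry_run \
  'SELECT COUNT(*) FROM `your-project.analytics.events` WHERE _PARTITIONDATE = "2026-03-01"'
```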

Security

Permissions, secrets, blast radius

Minimum scopes: bigquery.dataViewer + bigquery.jobUser on specific datasets only
Credential storage: Service account JSON in a mounted path; never commit
Data egress: Query results go to your LLM provider
Never grant: dataOwner / dataEditor to the MCP's service account

Troubleshooting

Common errors and fixes

PERMISSION_DENIED on dataset

The service account lacks BigQuery Data Viewer. Grant it with gcloud projects add-iam-policy-binding ....

Verify: bq ls (lists datasets the account can see)
Query exceeds configured byte limit

Add partition filter or column projection; or raise limit if legitimately needed.

Restricted field still appearing in results

Regex match may miss aliased columns — use BQ authorized views for hard isolation.
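A hardened alternative sketched in SQL (dataset and column names are illustrative); after creating the view, authorize its dataset on the source dataset so readers never need direct table access:

```sql
-- Expose everything except PII columns through a view
CREATE OR REPLACE VIEW `your-project.safe_views.users_safe` AS
SELECT * EXCEPT (email, phone, ssn)
FROM `your-project.analytics.users`;
```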

Alternatives

mcp-bigquery-server vs others

Alternative                  When to use it instead                        Tradeoff
Looker / Metabase            You want a BI tool, not chat                  Better dashboards; less conversational
postgres MCP via Cloud SQL   Your analytical data is in Postgres instead   Different engine; Postgres doesn't scale like BQ for big aggregates

More

Resources

📖 Read the official README on GitHub

🐙 Browse open issues

🔍 Browse all 400+ MCP servers and Skills