
mcp-bigquery-server

by ergut · ergut/mcp-bigquery-server

Read-only natural-language BigQuery from Claude — schema exploration, query limits, PII field restrictions — with service account auth.

mcp-bigquery-server is a Node.js MCP server that gives LLMs safe, read-only access to BigQuery datasets. It enforces a configurable scanned-bytes limit per query (1 GB by default), supports field-level restrictions for PII/PHI columns, and can be installed via Smithery or configured manually with service account credentials.


Install

Choose your client

~/Library/Application Support/Claude/claude_desktop_config.json  · Windows: %APPDATA%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "bigquery-server": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-bigquery-server"
      ]
    }
  }
}

Open Claude Desktop → Settings → Developer → Edit Config. Restart after saving.
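
The inferred snippet above doesn't include credentials, and the server needs a service account to reach BigQuery. One common pattern is to point the standard GOOGLE_APPLICATION_CREDENTIALS variable at your key file via the config's env block (env is part of Claude Desktop's MCP server config; check the project README for any additional flags the server itself expects):

```json
{
  "mcpServers": {
    "bigquery-server": {
      "command": "npx",
      "args": ["-y", "mcp-bigquery-server"],
      "env": {
        "GOOGLE_APPLICATION_CREDENTIALS": "/absolute/path/to/service-account.json"
      }
    }
  }
}
```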

~/.cursor/mcp.json · .cursor/mcp.json
{
  "mcpServers": {
    "bigquery-server": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-bigquery-server"
      ]
    }
  }
}

Cursor uses the same mcpServers schema as Claude Desktop. The project-level config overrides the global one.

VS Code → Cline → MCP Servers → Edit
{
  "mcpServers": {
    "bigquery-server": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-bigquery-server"
      ]
    }
  }
}

Click the MCP Servers icon in the Cline sidebar, then "Edit Configuration".

~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "bigquery-server": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-bigquery-server"
      ]
    }
  }
}

Same structure as Claude Desktop. Restart Windsurf to apply.

~/.continue/config.json
{
  "mcpServers": [
    {
      "name": "bigquery-server",
      "command": "npx",
      "args": [
        "-y",
        "mcp-bigquery-server"
      ]
    }
  ]
}

Continue uses an array of server objects instead of a map.

~/.config/zed/settings.json
{
  "context_servers": {
    "bigquery-server": {
      "command": {
        "path": "npx",
        "args": [
          "-y",
          "mcp-bigquery-server"
        ]
      }
    }
  }
}

Add it under context_servers. Zed reloads on save.

claude mcp add bigquery-server -- npx -y mcp-bigquery-server

A one-liner. Verify with claude mcp list; remove with claude mcp remove.

Use cases

mcp-bigquery-server in practice

Answer product/growth questions from BigQuery without writing SQL

👤 PMs, growth analysts with a BQ-backed warehouse ⏱ ~15 min · intermediate

When to use: You have a question whose answer lives in events tables in BQ.

Prerequisites
  • GCP service account with BQ Data Viewer + Job User — IAM > Create service account; download JSON key
Steps
  1. Discover tables
    List tables in dataset analytics. Describe events and users.
    → Schemas
  2. Ask the question
    How many users who signed up in March 2026 triggered the 'aha_moment' event within 7 days?
    → Numeric answer with SQL shown
  3. Caveats
    Any caveats? Timezone, deletion, test users?
    → Honest caveats

Result: Answers in minutes instead of data-team tickets.

Pitfalls
  • Running SELECT * on a huge fact table blows through the scan limit — always filter by the partition column (often _PARTITIONDATE)
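
To stay under the limit, scope the scan with a partition filter. A minimal sketch, assuming a daily ingestion-time-partitioned events table (table and column names are illustrative):

```sql
-- Scans only one week of partitions instead of the whole table
SELECT user_id, COUNT(*) AS events
FROM analytics.events
WHERE _PARTITIONDATE BETWEEN '2026-03-01' AND '2026-03-07'
GROUP BY user_id;
```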

Let a less-trusted analyst explore data without reading PII rows

👤 Data platform teams ⏱ ~30 min · advanced

When to use: You want to open up BQ access to more people via chat without each of them being able to read customer emails.

Steps
  1. Configure restricted fields
    Add a config.json entry restricting the fields users.email, users.phone, users.ssn. The agent can then only aggregate these, not SELECT them raw.
    → Config in place
  2. Test
    Run SELECT email FROM users LIMIT 10. Verify it's blocked. Then run SELECT domain, COUNT(*) FROM users GROUP BY domain — verify it works.
    → Blocked on raw read; allowed on aggregate

Result: Safer self-service analytics for the LLM era.

Pitfalls
  • Regex-based field detection can miss complex aliased SQL — defense in depth: also use BQ column-level security / authorized views
Combine with: gateway

Auto-compile a daily metrics digest from BQ

👤 PMs, founders ⏱ ~30 min · intermediate

When to use: You want KPIs in Slack every morning without a BI tool.

Steps
  1. Define the metrics
    Define queries for: DAU, signups, revenue, top-3 errors, each with yesterday / 7-day avg.
    → SQL per metric
  2. Run and format
    Run all of them and format the output as a Slack-ready digest. Include week-over-week deltas.
    → Slack-ready message

Result: Daily metrics without the cost of managed BI.

Combine with: notion
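
Step 2's formatting can be sketched offline. A minimal example of the week-over-week delta math and a Slack-style layout, with made-up metric names and numbers:

```python
# Sketch of the digest step: metric values in, Slack-ready text out.
# Metric names and numbers are illustrative, not from a real warehouse.

def wow_delta(yesterday: float, week_avg: float) -> str:
    """Format yesterday's value vs. the 7-day average as a WoW delta."""
    pct = (yesterday - week_avg) / week_avg * 100
    arrow = "▲" if pct >= 0 else "▼"
    return f"{arrow} {abs(pct):.1f}% WoW"

def digest(metrics: dict[str, tuple[float, float]]) -> str:
    """Build one Slack-ready message from {name: (yesterday, 7d_avg)}."""
    lines = ["*Daily metrics*"]
    for name, (yesterday, week_avg) in metrics.items():
        lines.append(f"• {name}: {yesterday:,.0f} ({wow_delta(yesterday, week_avg)})")
    return "\n".join(lines)

print(digest({"DAU": (1210, 1150), "Signups": (85, 90)}))
```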

Combinations

With other MCPs for 10x impact

bigquery-server + notion

Weekly KPI doc

Run my weekly KPI queries and create a Notion page in 'Metrics Weekly' with results + commentary.
bigquery-server + gateway

PII-safe access via mcp-gateway + Presidio

Put BigQuery MCP behind mcp-gateway with Presidio; verify customer emails get redacted in results.

Tools

What this MCP provides

Tool            Inputs                     When to call                                      Cost
list_datasets   —                          First step, to orient                             free
list_tables     dataset                    Navigate a dataset                                free
describe_table  dataset, table             Before querying                                   free
query           sql: str, max_bytes?: int  Main read tool; limited to 1 GB scan by default   BQ on-demand: $6.25 per TB scanned
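
The cost column and the default cap come down to simple arithmetic. A minimal sketch of it, with illustrative helper names that are not part of the server's API:

```python
# Illustrative arithmetic behind the $6.25/TB cost column and the 1 GB
# default scan cap. Helper names are mine, not the server's.

USD_PER_TB = 6.25                 # BigQuery on-demand price per (binary) TB scanned
DEFAULT_SCAN_LIMIT = 1 * 1024**3  # 1 GB default byte limit

def scan_cost_usd(bytes_scanned: int) -> float:
    """On-demand cost of a query that scans `bytes_scanned` bytes."""
    return bytes_scanned / 1024**4 * USD_PER_TB

def within_limit(bytes_scanned: int, limit: int = DEFAULT_SCAN_LIMIT) -> bool:
    """Would a query scanning this many bytes pass the configured cap?"""
    return bytes_scanned <= limit

print(scan_cost_usd(1024**4))     # → 6.25 (a full 1 TB scan)
print(within_limit(2 * 1024**3))  # → False: 2 GB exceeds the 1 GB default
```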

Costs & limits

What it costs to run

API quota
BigQuery job quotas (generous)
Tokens per call
Query results can be huge — always LIMIT or aggregate
Dollar cost
Pay GCP by bytes scanned ($6.25/TB on-demand). Configure the MCP's scan limit as a cap.
Tip
Filter by partition. A full-table scan on a busy fact table costs real money. The MCP's byte limit is your safety net.

Security

Permissions, secrets, scope

Minimal scopes: bigquery.dataViewer + bigquery.jobUser on specific datasets only
Credential storage: service account JSON at a mounted path; never commit it
Data egress: query results go to your LLM provider
Never grant: dataOwner / dataEditor to the MCP's service account

Troubleshooting

Common errors and fixes

PERMISSION_DENIED on dataset

The service account lacks BQ Data Viewer. gcloud projects add-iam-policy-binding ....

Check: bq ls
Query exceeds configured byte limit

Add a partition filter or column projection, or raise the limit if legitimately needed.

Restricted field still appearing in results

Regex match may miss aliased columns — use BQ authorized views for hard isolation.
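
The aliasing gap is easy to demonstrate. A toy check, assuming a naive pattern on the literal column path (not this server's actual matching logic):

```python
# Why regex field-blocking is best-effort: a naive pattern catches the
# obvious read of a restricted field but misses the same column behind
# a table alias. Illustrative only.
import re

RESTRICTED = re.compile(r"\busers\.email\b", re.IGNORECASE)

def naively_blocked(sql: str) -> bool:
    """True if the naive pattern would flag this query."""
    return bool(RESTRICTED.search(sql))

print(naively_blocked("SELECT users.email FROM users"))   # → True (caught)
print(naively_blocked("SELECT u.email FROM users AS u"))  # → False (missed!)
```

This is why authorized views or column-level security belong underneath any regex layer.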

Alternatives

mcp-bigquery-server vs. others

Alternative                  When instead                           Trade-off
Looker / Metabase            You want a BI tool, not chat           Better dashboards; less conversational
postgres MCP via Cloud SQL   Your analytical data is in Postgres    Different engine; Postgres doesn't scale like BQ for big aggregates

More

Resources

📖 Read the official README on GitHub

🐙 View open issues

🔍 Browse all 400+ MCP servers and skills