
db-mcp-server

von FreePeak · FreePeak/db-mcp-server

One MCP server, many databases — MySQL, Postgres, SQLite, Oracle, TimescaleDB. Each connection gets its own query/schema/perf toolset.

db-mcp-server (FreePeak) connects to multiple databases simultaneously. For each configured connection it auto-generates query/execute/transaction/schema/performance tools. Supports TimescaleDB (hypertables, continuous aggregates) and Oracle specifics (RAC, wallet).


Install

Choose your client

~/Library/Application Support/Claude/claude_desktop_config.json  · Windows: %APPDATA%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "db": {
      "command": "TODO",
      "args": [
        "See README: https://github.com/FreePeak/db-mcp-server"
      ],
      "_inferred": true
    }
  }
}

Open Claude Desktop → Settings → Developer → Edit Config. Restart after saving.
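Once the server is built, the filled-in entry should look roughly like the sketch below. The binary location is a placeholder, and the `-t`/`-c` flags mirror the `./bin/server -t sse -c config.json` invocation used in the use cases below, swapped to stdio transport for a desktop client; confirm the exact command against the README.

```json
{
  "mcpServers": {
    "db": {
      "command": "/path/to/db-mcp-server/bin/server",
      "args": ["-t", "stdio", "-c", "/path/to/config.json"]
    }
  }
}
```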

~/.cursor/mcp.json · .cursor/mcp.json
{
  "mcpServers": {
    "db": {
      "command": "TODO",
      "args": [
        "See README: https://github.com/FreePeak/db-mcp-server"
      ],
      "_inferred": true
    }
  }
}

Cursor uses the same mcpServers schema as Claude Desktop. A project-level config overrides the global one.

VS Code → Cline → MCP Servers → Edit
{
  "mcpServers": {
    "db": {
      "command": "TODO",
      "args": [
        "See README: https://github.com/FreePeak/db-mcp-server"
      ],
      "_inferred": true
    }
  }
}

Click the MCP Servers icon in the Cline sidebar, then "Edit Configuration".

~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "db": {
      "command": "TODO",
      "args": [
        "See README: https://github.com/FreePeak/db-mcp-server"
      ],
      "_inferred": true
    }
  }
}

Same structure as Claude Desktop. Restart Windsurf to apply.

~/.continue/config.json
{
  "mcpServers": [
    {
      "name": "db",
      "command": "TODO",
      "args": [
        "See README: https://github.com/FreePeak/db-mcp-server"
      ]
    }
  ]
}

Continue uses an array of server objects instead of a map.

~/.config/zed/settings.json
{
  "context_servers": {
    "db": {
      "command": {
        "path": "TODO",
        "args": [
          "See README: https://github.com/FreePeak/db-mcp-server"
        ]
      }
    }
  }
}

Add it under context_servers. Zed reloads on save.

claude mcp add db -- TODO 'See README: https://github.com/FreePeak/db-mcp-server'

One-liner. Verify with claude mcp list. Remove with claude mcp remove.

Use cases

Hands-on usage: db-mcp-server

Run cross-database analysis from Claude

👤 Data engineers ⏱ ~25 min intermediate

When to use: You need to pull from Postgres (app) AND MySQL (legacy) in one conversation without switching tools.

Prerequisites
  • config.json with both connections — Repo docs show the shape; store credentials in env refs, not inline
Steps
  1. Start the server
    Run ./bin/server -t sse -c config.json and confirm both connections come up.
    → Server logs 2 connections ok
  2. Query each
    From prod (Postgres): users signed up last week. From legacy (MySQL): orders attributed to them. Join in memory.
    → Combined dataset
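The in-memory join from step 2 can be sketched in plain Python; all column names here (id, user_id, amount) are illustrative, not taken from any real schema.

```python
# Two result sets, as the agent would receive them from query_prod / query_legacy.
postgres_users = [  # hypothetical: SELECT id, email FROM users WHERE ...
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": "b@example.com"},
]
mysql_orders = [  # hypothetical: SELECT user_id, amount FROM orders WHERE ...
    {"user_id": 1, "amount": 40.0},
    {"user_id": 1, "amount": 2.5},
]

def join_orders(users, orders):
    """Attach each user's legacy order count and total spend."""
    by_user = {}
    for o in orders:
        by_user.setdefault(o["user_id"], []).append(o["amount"])
    return [
        {**u,
         "orders": len(by_user.get(u["id"], [])),
         "total": sum(by_user.get(u["id"], []))}
        for u in users
    ]

combined = join_orders(postgres_users, mysql_orders)
```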

Result: Cross-system insights without a data warehouse.

Pitfalls
  • Read-only assumption breaks (agent runs INSERTs): use DB-level read-only users per connection; don't rely on agent discipline
Combine with: google-sheets

Manage TimescaleDB hypertables from Claude

👤 Observability / IoT teams ⏱ ~20 min advanced

When to use: You want to create/inspect hypertables and continuous aggregates without memorizing Timescale DDL.

Steps
  1. Inspect existing hypertables
    List all hypertables in metrics DB with chunk interval and row count.
    → Table of hypertables
  2. Create a continuous aggregate
    Create a 1-hour rollup on sensor_readings grouping by device_id, avg + max + min.
    → CAgg created; refresh policy configured
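Step 2 corresponds to Timescale DDL along these lines; the sensor_readings(device_id, ts, value) shape and the policy offsets are assumptions for illustration.

```sql
-- Hourly rollup over a hypothetical sensor_readings hypertable.
CREATE MATERIALIZED VIEW sensor_readings_1h
WITH (timescaledb.continuous) AS
SELECT device_id,
       time_bucket(INTERVAL '1 hour', ts) AS bucket,
       avg(value) AS avg_value,
       max(value) AS max_value,
       min(value) AS min_value
FROM sensor_readings
GROUP BY device_id, bucket;

-- Keep the aggregate fresh on a schedule.
SELECT add_continuous_aggregate_policy('sensor_readings_1h',
  start_offset      => INTERVAL '3 hours',
  end_offset        => INTERVAL '1 hour',
  schedule_interval => INTERVAL '30 minutes');
```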

Result: Timescale ops in minutes, not Google searches.

Explain an unfamiliar schema to get onboarded fast

👤 Engineers inheriting a database ⏱ ~30 min beginner

When to use: Day 1 at a new team; you need a map of the DB.

Steps
  1. Dump schema
    Use schema_<conn_id> on prod. Return tables, FK graph, and row-count order.
    → Schema + FK map
  2. Generate a glossary
    For each table, infer a 1-line description from column names and sample rows (limit 5 per table).
    → Onboarding cheat sheet
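Step 2's heuristic can be sketched as a tiny Python function; the table and column names are made up for the example.

```python
def describe_table(table, columns):
    """One-line glossary entry: split key columns from attribute columns."""
    keys = [c for c in columns if c == "id" or c.endswith("_id")]
    rest = [c for c in columns if c not in keys]
    return (f"{table}: keyed by {', '.join(keys) or 'n/a'}; "
            f"attributes: {', '.join(rest) or 'none'}")

# Hypothetical schema dump, e.g. distilled from schema_prod output.
schema = {
    "users": ["id", "email", "created_at"],
    "orders": ["id", "user_id", "amount"],
}
glossary = [describe_table(t, cols) for t, cols in schema.items()]
```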

Result: Working mental model in 30 min.

Combinations

With other MCPs for 10x impact

Export a query result into a shared Sheet for non-tech stakeholders

Query top 50 customers by LTV from prod; write to the 'Top LTV' sheet.

Cross-check DB slow-query findings with DB-level Prometheus metrics

For the slow query found via performance_prod, show the pg_stat_statements metrics from Prometheus for the same window.

Tools

What this MCP provides

Tool | Inputs | When to call | Cost
query_<db_id> | sql: str (SELECT) | Read data | 1 query
execute_<db_id> | sql: str (DDL/DML) | Writes, guarded by DB permissions | 1 query
transaction_<db_id> | statements: str[] | Multi-statement atomic changes | 1 tx
schema_<db_id> | table?: str | Discovery / onboarding | metadata query
generate_schema_<db_id> | format: sql or json | Export for docs/version control | metadata queries
performance_<db_id> | sql?: str | Tune a slow query | plan + stats

Costs & limits

What it costs to run

API quota
Your DBs' capacity
Tokens per call
Large result sets burn tokens fast; LIMIT aggressively
Cost (€)
Free MCP; DB hosting costs are yours
Tip
Always add LIMIT / cap output; stream to a file via the filesystem MCP for bigger pulls

Security

Permissions, secrets, scope

Minimal scopes: a DB-level read-only user is recommended for exploration
Credential storage: config.json should reference env vars; never commit it with inline passwords
Data egress: only to the configured DB hosts
Never grant: DB superuser to the MCP connection unless absolutely needed
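For Postgres, a least-privilege role for the MCP connection can be set up roughly like this; role, database, and schema names are examples.

```sql
-- Read-only role for exploration; use a real secret manager for the password.
CREATE ROLE mcp_readonly LOGIN PASSWORD 'change-me';
GRANT CONNECT ON DATABASE prod TO mcp_readonly;
GRANT USAGE ON SCHEMA public TO mcp_readonly;
GRANT SELECT ON ALL TABLES IN SCHEMA public TO mcp_readonly;
-- Cover tables created after this grant as well.
ALTER DEFAULT PRIVILEGES IN SCHEMA public
  GRANT SELECT ON TABLES TO mcp_readonly;
```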

Troubleshooting

Common errors and fixes

Connection pool exhausted

Tune pool size in config.json; kill zombie sessions on DB side; verify no runaway agent loops

Check: SELECT * FROM pg_stat_activity (for Postgres)
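Killing zombie sessions on the DB side can be done with something like the following on Postgres; the 10-minute threshold is an arbitrary example, so tune it to your workload.

```sql
-- Terminate sessions that have been idle for more than 10 minutes.
SELECT pg_terminate_backend(pid)
FROM pg_stat_activity
WHERE state = 'idle'
  AND state_change < now() - INTERVAL '10 minutes'
  AND pid <> pg_backend_pid();  -- never kill our own session
```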
Oracle wallet auth fails

The TNS_ADMIN path must be readable by the MCP process; on Linux, watch for SELinux/AppArmor denials

Tool names don't appear for one DB

That connection likely failed to init; check server logs — usually wrong creds or firewall

Alternatives

db-mcp-server vs. others

Alternative | When to prefer it | Trade-off
postgres-mcp (official) | You only need Postgres | Single-DB
mysql-mcp (community) | You only need MySQL | Single-DB

More

Resources

📖 Read the official README on GitHub

🐙 View open issues

🔍 Browse all 400+ MCP servers and skills