
DBHub

by bytebase · bytebase/dbhub

One MCP, many databases — Postgres, MySQL, SQL Server, SQLite, Oracle — in a read-only-by-default query interface.

Bytebase's DBHub is a zero-dependency MCP server that speaks to multiple relational databases through a single npx @bytebase/dbhub command. Pass the DSN for your database flavor and you get schema browsing, table sampling, and SQL execution. It runs in read-only mode by default, making it safe for exploratory sessions against production.


Install

Pick your client

~/Library/Application Support/Claude/claude_desktop_config.json  · Windows: %APPDATA%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "dbhub": {
      "command": "npx",
      "args": [
        "-y",
        "@bytebase/dbhub"
      ]
    }
  }
}

Open Claude Desktop → Settings → Developer → Edit Config. Restart after saving.
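The stock config above launches DBHub without a connection string; in practice you also pass a DSN. A sketch with the --dsn argument and placeholder credentials (flag name per the DBHub README — check the version you install):

```json
{
  "mcpServers": {
    "dbhub": {
      "command": "npx",
      "args": [
        "-y",
        "@bytebase/dbhub",
        "--dsn",
        "postgres://readonly_user:secret@db.example.com:5432/app"
      ]
    }
  }
}
```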

~/.cursor/mcp.json · .cursor/mcp.json
{
  "mcpServers": {
    "dbhub": {
      "command": "npx",
      "args": [
        "-y",
        "@bytebase/dbhub"
      ]
    }
  }
}

Cursor uses the same mcpServers schema as Claude Desktop. Project config wins over global.

VS Code → Cline → MCP Servers → Edit
{
  "mcpServers": {
    "dbhub": {
      "command": "npx",
      "args": [
        "-y",
        "@bytebase/dbhub"
      ]
    }
  }
}

Click the MCP Servers icon in the Cline sidebar, then "Edit Configuration".

~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "dbhub": {
      "command": "npx",
      "args": [
        "-y",
        "@bytebase/dbhub"
      ]
    }
  }
}

Same shape as Claude Desktop. Restart Windsurf to pick up changes.

~/.continue/config.json
{
  "mcpServers": [
    {
      "name": "dbhub",
      "command": "npx",
      "args": [
        "-y",
        "@bytebase/dbhub"
      ]
    }
  ]
}

Continue uses an array of server objects rather than a map.

~/.config/zed/settings.json
{
  "context_servers": {
    "dbhub": {
      "command": {
        "path": "npx",
        "args": [
          "-y",
          "@bytebase/dbhub"
        ]
      }
    }
  }
}

Add to context_servers. Zed hot-reloads on save.

claude mcp add dbhub -- npx -y @bytebase/dbhub

One-liner. Verify with claude mcp list. Remove with claude mcp remove dbhub.

Use Cases

Real-world ways to use DBHub

Query 3 different databases in one session

👤 Engineers whose stack has >1 relational DB · ⏱ ~20 min · intermediate

When to use: Your stack has Postgres for primary data, MySQL for a legacy service, and SQL Server for a reporting copy — and you want one AI assistant across all.

Prerequisites
  • DSN for each DB with read-only creds — postgres://, mysql://, sqlserver://, sqlite://, oracle:// formats
Flow
  1. Configure multiple DSNs
    Show me which DB I'm currently pointed at. If needed, switch to the MySQL DSN.
    → Clear active-DB indicator
  2. Inspect schema
    List tables in the current DB with approximate row counts.
    → Table catalog
  3. Cross-reference across DBs
    Query Postgres for user emails, then query the MySQL legacy_users table for the same emails, and tell me who's in one but not the other.
    → Reconciliation report

Outcome: A single workflow across heterogeneous DBs without juggling different MCP servers.

Pitfalls
  • SQL dialect differences trip up Claude (e.g. LIMIT vs TOP) — Tell Claude explicitly which DB flavor the current query targets, or split into DB-specific turns
Combine with: filesystem
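The cross-DB reconciliation in step 3 boils down to a set difference. A toy sketch with two throwaway SQLite databases standing in for Postgres and MySQL (table names from the prompt above; the emails are made up):

```python
import sqlite3

# In-memory stand-ins for the two real databases (hypothetical data).
pg = sqlite3.connect(":memory:")   # plays the Postgres primary
my = sqlite3.connect(":memory:")   # plays the MySQL legacy service

pg.execute("CREATE TABLE users (email TEXT)")
pg.executemany("INSERT INTO users VALUES (?)",
               [("a@x.io",), ("b@x.io",), ("c@x.io",)])
my.execute("CREATE TABLE legacy_users (email TEXT)")
my.executemany("INSERT INTO legacy_users VALUES (?)",
               [("b@x.io",), ("d@x.io",)])

# Pull each side into a set, then diff both ways.
pg_emails = {row[0] for row in pg.execute("SELECT email FROM users")}
my_emails = {row[0] for row in my.execute("SELECT email FROM legacy_users")}

print(sorted(pg_emails - my_emails))  # present in Postgres only
print(sorted(my_emails - pg_emails))  # present in legacy MySQL only
```

With real databases the two SELECTs run through DBHub against different DSNs; the diff itself is the same set logic.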

Analyze a SQLite file someone sent you

👤 Engineers / analysts handed an opaque .db file · ⏱ ~10 min · beginner

When to use: A customer sent a sqlite dump and wants it eyeballed.

Flow
  1. Point DBHub at the file
    Use DSN sqlite:///path/to/data.db. List tables + row counts.
    → Table inventory
  2. Sample each
    For each non-trivial table, show 5 sample rows and infer the purpose.
    → Per-table summary
  3. Answer the customer's question
    Customer asks: <question>. Write SQL, run, return answer.
    → Query + result

Outcome: Fast exploration of an unfamiliar sqlite file without extracting it into another tool.

Pitfalls
  • Large SQLite tables may lack indexes — full scans can lock the file — Open the file read-only; avoid aggregations over >1M rows in a single query
Combine with: filesystem

Run reporting queries against a read-replica safely

👤 BI / analytics · ⏱ ~15 min · beginner

When to use: You have a replica for analytics and want AI-driven ad-hoc reports without exposing the primary.

Prerequisites
  • Read-only DSN against the replica — Replica-only credentials; statement_timeout enforced in DSN
Flow
  1. Verify it's the replica
    Confirm the current connection is read-only and points at the replica host.
    → Host string + read-only flag verified
  2. Run the report
    [paste business question]. Translate to SQL, run, return results.
    → Result set
  3. Persist for re-use
    Save this SQL to /reports/<name>.sql with a comment explaining the question.
    → SQL file saved

Outcome: Ad-hoc BI without risk to prod primary.

Pitfalls
  • Heavy queries slow the replica and create replication lag — Set statement_timeout and run big queries off-peak
Combine with: filesystem · antv-chart
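One way to enforce the statement_timeout mentioned in the prerequisites is libpq's options parameter inside the DSN itself, URL-encoded. A sketch (host, role, and password are placeholders):

```python
from urllib.parse import quote

timeout_ms = 5000
# libpq forwards "-c name=value" pairs via the "options" URI parameter;
# the space and "=" must be percent-encoded to survive URI parsing.
opts = quote(f"-c statement_timeout={timeout_ms}", safe="")
dsn = f"postgres://reporter:secret@replica.example.com:5432/analytics?options={opts}"
print(dsn)
```

Every query the AI runs through this DSN is then killed server-side after 5 seconds, no matter how enthusiastic the full scan.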

Audit SQL Server stored procedures for a migration

👤 Teams migrating off SQL Server · ⏱ ~30 min · advanced

When to use: You need a list of every stored procedure, its lines of code, and last-modified date.

Flow
  1. List procs
    Query sys.procedures + sys.sql_modules to list all procs with name, schema, lines, and last modified date.
    → Proc inventory
  2. Classify complexity
    Bucket procs by line count: trivial (<50), medium (50-300), complex (>300). Count each bucket.
    → Complexity histogram
  3. Surface MSSQL-specific features
    For complex procs, flag usage of MSSQL-specific constructs (CROSS APPLY, CTE recursion, TOP, GETDATE) — these are the hard migration items.
    → Migration-risk list

Outcome: A stored-procedure migration plan grounded in real counts.

Pitfalls
  • Some procs contain dynamic SQL that's hard to classify — Flag any proc with EXEC sp_executesql for manual review
Combine with: filesystem
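Step 2's classification is trivial once you have name → line-count pairs pulled from sys.sql_modules; a sketch over a hypothetical proc inventory:

```python
from collections import Counter

def bucket(line_count: int) -> str:
    # Thresholds from the flow above: trivial (<50), medium (50-300), complex (>300)
    if line_count < 50:
        return "trivial"
    return "medium" if line_count <= 300 else "complex"

# Hypothetical inventory: proc name -> line count (real values come from step 1)
procs = {"usp_GetUser": 12, "usp_Billing": 180, "usp_NightlyETL": 640}
histogram = Counter(bucket(n) for n in procs.values())
print(dict(histogram))
```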

Combinations

Pair with other MCPs for 10x leverage

dbhub + antv-chart

Run SQL then chart the result directly

Query weekly revenue from the Postgres replica via DBHub, then render as an AntV line chart.
dbhub + filesystem

Save queries + results for reproducibility

Run the weekly KPI query, save SQL to /sql/weekly.sql and result CSV to /data/weekly-<date>.csv.
dbhub + notion

Post a SQL-backed report to Notion

Run the top-customers query, create a Notion page with the result as a table.

Tools

What this MCP exposes

Tool | Inputs | When to call | Cost
list_databases | — | First exploration step | free
list_tables | database? | Catalog before queries | free
describe_table | table, schema? | Inspect schema before querying | free
execute_sql | sql, params? | Read or write SQL (write requires flag) | depends on query
execute_read_sql | sql, params? | Explicit read-only execution | depends

Cost & Limits

What this costs to run

API quota
Bounded by your DB connection limits
Tokens per call
Depends on result size; cap with LIMIT
Monetary
Free — costs are your DB hosting only
Tip
Set a statement_timeout in the DSN; AI-written queries can be enthusiastic about full scans.

Security

Permissions, secrets, blast radius

Minimum scopes: SELECT on target tables
Credential storage: DSN in env (DSN or per-flavor env var)
Data egress: Direct to your DB; no third-party proxy
Never grant: CREATE/DROP/ALTER in the connection role unless needed for the session

Troubleshooting

Common errors and fixes

Authentication failed / access denied

DSN credentials wrong or lacking SELECT. Recheck DSN format for each flavor.

Verify: Connect with the DB's native client using the same DSN
Unsupported SQL feature / syntax error

Flavor mismatch — tell Claude which DB dialect is active, or re-check the DSN prefix.

Connection pool exhausted

Lower concurrency or increase pool size; long-running queries are usually the real cause.

Writes rejected (read-only)

DBHub runs read-only by default. Restart with --readonly=false to allow writes for the session.

Alternatives

DBHub vs others

Alternative | When to use it instead | Tradeoff
Postgres MCP | You only use Postgres; deeper Postgres-specific features | Single-flavor
MongoDB MCP | You need Mongo alongside relational | Different data model
Supabase MCP | You're on Supabase and want project+DB management | Tied to Supabase

More

Resources

📖 Read the official README on GitHub

🐙 Browse open issues

🔍 Browse all 400+ MCP servers and Skills