● Official modelcontextprotocol 🔑 Requires your own key

Postgres

by modelcontextprotocol · modelcontextprotocol/servers

Let Claude query your Postgres database with read-only safety, full schema introspection, and EXPLAIN plans — without ever giving it write access.

The reference Postgres MCP server. Connects with a standard postgres:// URL, exposes schema browsing, query execution, and EXPLAIN. Strictly read-only — no INSERT/UPDATE/DELETE/DDL — making it safe to point at production read replicas.

Why use it

Key features

Live demo

What it looks like in practice

Installation

Choose your client

~/Library/Application Support/Claude/claude_desktop_config.json  · Windows: %APPDATA%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-postgres",
        "postgresql://..."
      ]
    }
  }
}

Open Claude Desktop → Settings → Developer → Edit Config. Restart after saving.

~/.cursor/mcp.json · .cursor/mcp.json
{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-postgres",
        "postgresql://..."
      ]
    }
  }
}

Cursor uses the same mcpServers schema as Claude Desktop. The project-level config takes precedence over the global one.

VS Code → Cline → MCP Servers → Edit
{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-postgres",
        "postgresql://..."
      ]
    }
  }
}

Click the MCP Servers icon in the Cline sidebar, then "Edit Configuration".

~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-postgres",
        "postgresql://..."
      ]
    }
  }
}

Same format as Claude Desktop. Restart Windsurf to apply.

~/.continue/config.json
{
  "mcpServers": [
    {
      "name": "postgres",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-postgres",
        "postgresql://..."
      ]
    }
  ]
}

Continue uses an array of server objects rather than a map.

~/.config/zed/settings.json
{
  "context_servers": {
    "postgres": {
      "command": {
        "path": "npx",
        "args": [
          "-y",
          "@modelcontextprotocol/server-postgres",
          "postgresql://..."
        ]
      }
    }
  }
}

Add it under context_servers. Zed reloads automatically.

claude mcp add postgres -- npx -y @modelcontextprotocol/server-postgres postgresql://...

A one-line command. Verify with claude mcp list. Remove with claude mcp remove.

Use cases

Real-world scenarios: Postgres

Answer ad-hoc business questions without touching SQL

👤 PMs, founders, anyone who'd rather not write SQL ⏱ ~10 min beginner

When to use: You have a question about your data ('how many users came back this week?') and the BI dashboard doesn't have it.

Prerequisites
  • Read-only postgres:// connection string to a replica — Most managed PG (RDS, Neon, Supabase) lets you create read-only credentials
  • Network access from where Claude runs to the DB — VPN or IP allowlist your machine
Flow
  1. Have Claude introspect the relevant tables first
    List all tables in our DB. For tables related to users, orders, or sessions, describe their schemas.
    → Schema overview before any query
  2. Ask the actual question
    How many users signed up in the last 30 days but haven't placed an order yet? Group by signup week.
    → Claude writes SQL, runs it, and returns a results table
  3. Probe for caveats
    Are there any reasons this number could be misleading? Soft deletes? Timezone in created_at? Specific user types we should exclude?
    → Honest call-out of data quirks

Outcome: A defensible answer to a business question with the SQL, the result, and the caveats — in 2 minutes instead of waiting 2 days for the data team.

Pitfalls
  • Claude writes a query that scans your largest table without limits — Set statement_timeout = '30s' on the connection, and add 'always include LIMIT 1000 by default' to your system prompt
  • Counting 'users' depends on what counts as a user (deleted? bot? test?) — Tell Claude your conventions upfront: 'exclude rows where deleted_at IS NOT NULL', etc.
Combine with: notion

Diagnose why a query is slow and suggest indexes

👤 Backend engineers, DBAs ⏱ ~15 min intermediate

When to use: You have a query that's slower than it should be. You want a second pair of eyes that doesn't get tired reading EXPLAIN ANALYZE output.

Flow
  1. Get the query plan
    Run EXPLAIN ANALYZE on this query: [paste]. Walk me through what the planner is doing.
    → Step-by-step plan walkthrough
  2. Identify the cost driver
    Which step is responsible for most of the cost? Is it a sequential scan, a bad join order, or expensive filtering?
    → Specific node identified, with a reason
  3. Suggest an index or rewrite
    Suggest the smallest change to make this fast. Prefer adding an index over rewriting the query, but only if the index would be useful for >1 query.
    → A concrete CREATE INDEX statement OR a rewritten query

Outcome: An indexed-or-rewritten query, with reasoning, that you can verify by running EXPLAIN again.

Pitfalls
  • EXPLAIN on a non-representative dataset (small dev DB) gives misleading plans — Always run EXPLAIN against a database with production-shaped data; otherwise the plan is fiction
  • Adding an index seems free but slows down writes — Tell Claude to verify the index would be used by checking with EXPLAIN before asking you to add it
Combine with: sentry
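The before/after check in the second pitfall can be reproduced locally. A sketch using SQLite's EXPLAIN QUERY PLAN as a stand-in (Postgres's EXPLAIN ANALYZE output is richer, but the scan-to-index transition has the same shape; the orders schema and index name are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INTEGER, total_cents INTEGER)"
)
# Seed 5000 deterministic rows so the planner has something to chew on.
conn.executemany(
    "INSERT INTO orders (user_id, total_cents) VALUES (?, ?)",
    [(i % 1000, (i * 7) % 10000) for i in range(5000)],
)

def plan(sql: str) -> str:
    """Concatenate the detail column of EXPLAIN QUERY PLAN output."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT * FROM orders WHERE user_id = 42"
before = plan(query)  # sequential scan of the whole table

conn.execute("CREATE INDEX idx_orders_user_id ON orders(user_id)")
after = plan(query)   # now an index search on idx_orders_user_id

print("before:", before)
print("after: ", after)
```

Running EXPLAIN again after the CREATE INDEX is exactly the verification step the pitfall asks for.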

Audit a table for data quality issues

👤 Data engineers, anyone inheriting an unfamiliar schema ⏱ ~25 min intermediate

When to use: You're about to build a feature on top of a table you didn't design, and you suspect it has problems.

Flow
  1. Run a battery of NULL / duplicate / orphan checks
    For the orders table: count NULL values per column, count duplicate rows by some natural key (e.g. (user_id, stripe_payment_intent_id)), count rows with foreign keys pointing to deleted parent rows.
    → Issue counts per check
  2. Check for value-distribution oddities
    What are the min, max, and percentile distribution for total_cents? Are there suspiciously many rows with 0 or negative values?
    → Distribution stats, outliers flagged
  3. Cross-check against expected business rules
    Every 'completed' order should have a non-null paid_at. Are there exceptions?
    → Count of violations + sample IDs

Outcome: A short list of concrete data-integrity bugs, each with a count and a fix path.

Pitfalls
  • Some 'issues' are intentional historical artifacts (data migrations) — Always confirm with someone who knows the history before assuming it's a bug
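The battery of checks in step 1 and the business rule in step 3 are plain SQL. A runnable sketch against an in-memory SQLite stand-in with deliberately seeded bugs (the schema is the hypothetical one from the prompts):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE users  (id INTEGER PRIMARY KEY, deleted_at TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INTEGER,
                     stripe_payment_intent_id TEXT, paid_at TEXT, status TEXT);
INSERT INTO users  VALUES (1, NULL), (2, '2026-01-01');
INSERT INTO orders VALUES
  (1, 1, 'pi_a', '2026-02-01', 'completed'),
  (2, 1, 'pi_a', NULL,         'completed'),   -- duplicate natural key + rule violation
  (3, 9, 'pi_b', '2026-02-02', 'completed');   -- orphan: user 9 does not exist
""")

checks = {
    # NULLs in a column that should always be set
    "null_paid_at": "SELECT COUNT(*) FROM orders WHERE paid_at IS NULL",
    # duplicates by the assumed natural key
    "dup_natural_key": """
        SELECT COUNT(*) FROM (
          SELECT user_id, stripe_payment_intent_id FROM orders
          GROUP BY user_id, stripe_payment_intent_id HAVING COUNT(*) > 1)""",
    # orphaned foreign keys
    "orphan_user_fk": """
        SELECT COUNT(*) FROM orders o
        WHERE NOT EXISTS (SELECT 1 FROM users u WHERE u.id = o.user_id)""",
    # business rule: completed orders must have paid_at
    "completed_unpaid": """
        SELECT COUNT(*) FROM orders
        WHERE status = 'completed' AND paid_at IS NULL""",
}
results = {name: conn.execute(sql).fetchone()[0] for name, sql in checks.items()}
print(results)
```

Each non-zero count is one concrete bug with an obvious follow-up query for sample IDs.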

Auto-generate schema documentation for your team

👤 Tech leads onboarding new engineers ⏱ ~20 min beginner

When to use: Your DB has 40 tables, the wiki has 0. New hires keep asking 'what's this column?'

Flow
  1. Get all tables and their schemas
    List every table in the public schema. For each, give me columns, types, nullability, defaults, and any foreign keys.
    → Complete schema dump
  2. Infer purpose from naming + sample data
    For each table, sample 3 rows and write a one-paragraph description of what this table represents in our business.
    → Per-table prose explanation
  3. Flag unknown / suspicious tables
    Are there any tables that look unused, or that you can't infer a purpose for? List them so I can ask the original author.
    → An honest 'I don't know what these are' list

Outcome: A Markdown doc your team can drop into Notion or a wiki — covering 80% of what new hires need to know.

Pitfalls
  • Sampling sensitive data (PII) into the LLM context — For tables with PII, ask Claude to describe schemas only, without sampling rows
Combine with: notion · filesystem
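The schema dump in step 1 maps directly onto catalog queries. A minimal sketch that renders a Markdown table per table, using SQLite's PRAGMA table_info as a stand-in for Postgres's information_schema.columns (the two example tables are assumptions):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE users  (id INTEGER PRIMARY KEY, email TEXT NOT NULL, deleted_at TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INTEGER REFERENCES users(id),
                     total_cents INTEGER NOT NULL DEFAULT 0);
""")

lines = []
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name")]
for t in tables:
    lines.append(f"## {t}")
    lines.append("| column | type | nullable | default |")
    lines.append("|--------|------|----------|---------|")
    # PRAGMA table_info rows: (cid, name, type, notnull, dflt_value, pk)
    for _, name, ctype, notnull, default, _ in conn.execute(f"PRAGMA table_info({t})"):
        lines.append(f"| {name} | {ctype} | {'no' if notnull else 'yes'} | {default or ''} |")
    lines.append("")
doc = "\n".join(lines)
print(doc)
```

The resulting Markdown is the skeleton; Claude fills in the per-table prose from step 2.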

Compute A/B test results from raw event data

👤 Product analysts, growth engineers ⏱ ~30 min advanced

When to use: You ran an experiment, the data is in your DB, and you want significance numbers without writing SQL by hand.

Prerequisites
  • Events table with experiment assignment + conversion events — Standard schema: events(user_id, experiment, variant, timestamp), conversions(user_id, type, timestamp)
Flow
  1. Compute the conversion rate per variant
    For experiment 'checkout-redesign-2026': how many users were assigned to each variant, and what was the conversion rate (by [your conversion event]) per variant?
    → Per-variant table with rates
  2. Calculate statistical significance
    Compute the chi-squared p-value for the difference between control and treatment. Is the result statistically significant at p < 0.05?
    → p-value with verdict
  3. Sanity-check the numbers
    Are sample sizes balanced? Did the experiment run long enough? Any segments where the result reverses?
    → A health check, not just a p-value

Outcome: A statistically defensible A/B test readout with the SQL, the numbers, and the caveats.

Pitfalls
  • Peeking at results before the pre-defined sample size is reached leads to false positives — Make Claude check whether the test reached its target sample size before computing significance
Combine with: notion
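The significance math in step 2 is a standard chi-squared test on a 2x2 contingency table. A self-contained sketch in stdlib Python (the counts are invented for illustration; for 1 degree of freedom the p-value is erfc(sqrt(chi2 / 2))):

```python
import math

# Hypothetical readout for 'checkout-redesign-2026': (assigned, converted) per variant.
control   = (10_000, 1_000)   # 10.0% conversion
treatment = (10_000, 1_120)   # 11.2% conversion

def chi2_2x2(a, b):
    """Chi-squared test (1 df, no continuity correction) on a 2x2 table."""
    n1, c1 = a
    n2, c2 = b
    table = [[c1, n1 - c1], [c2, n2 - c2]]
    total = n1 + n2
    col = [c1 + c2, total - (c1 + c2)]       # column totals: converted / not converted
    chi2 = 0.0
    for i, row_n in enumerate((n1, n2)):
        for j in range(2):
            expected = row_n * col[j] / total
            chi2 += (table[i][j] - expected) ** 2 / expected
    # Survival function of chi-squared with 1 df: p = erfc(sqrt(chi2 / 2))
    return chi2, math.erfc(math.sqrt(chi2 / 2))

chi2, p = chi2_2x2(control, treatment)
print(f"chi2={chi2:.2f} p={p:.4f} significant={p < 0.05}")
```

With these made-up counts the lift is significant; the step 3 sanity checks still apply before acting on it.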

Combos

Combine with other MCPs for a 10x effect

postgres + notion

Run a query, post results as a Notion table for stakeholders who don't have DB access

Query our top 10 customers by lifetime revenue this quarter, then create a Notion page in 'Sales Reports' with the results as a formatted table.
postgres + sentry

Cross-reference DB state with errors — when an error mentions a record ID, look it up

Sentry issue WEB-3a91 mentions order_id 99214. Look up that order and tell me if anything in the row data could explain the crash.
postgres + filesystem

Export query results as CSV/JSON for downstream use

Run my churn-cohort query and save the result as /reports/churn-2026-04.csv.

Tools

What this MCP provides

Tool | Inputs | When to call | Cost
list_tables | schema?: str | First step in any session — discover the schema | free
describe_table | table: str, schema?: str | Get full structure of a specific table before querying | free
query | sql: str | Run any read-only SQL — SELECT only | depends on query

Cost and limits

What it costs

API quota
Bounded by your DB's connection limit and query timeout
Tokens per call
Schema queries: ~500 tokens. Result sets: depends on row count — cap with LIMIT
Money
Free — costs are whatever your DB hosting bill already is
Tip
Always set a statement_timeout on the connection (e.g. ?options=-c%20statement_timeout%3D30000) so a runaway query can't take down your DB.

Security

Permissions, secrets, blast radius

Minimum scopes: SELECT on the tables you want exposed
Credential storage: Connection string in an env var. Use a dedicated read-only role: CREATE ROLE claude_readonly LOGIN PASSWORD '...'; GRANT SELECT ON ALL TABLES IN SCHEMA public TO claude_readonly;
Outbound traffic: All queries go to your DB; result rows are shipped to whatever LLM provider you use
Never grant: INSERT, UPDATE, DELETE, DROP, TRUNCATE, ALTER

Troubleshooting

Common errors and fixes

FATAL: password authentication failed

Check the connection string. Common cause: special chars in password not URL-encoded.

Verify: psql 'postgres://...' -c 'SELECT 1'
no pg_hba.conf entry / SSL required

Append ?sslmode=require to the connection string. Most managed Postgres requires SSL.

permission denied for table X

The role doesn't have SELECT on that table. Run GRANT SELECT ON X TO claude_readonly.

Verify: psql -c '\dp X'
canceling statement due to statement timeout

The query was too slow. Either optimize it (add an index, narrow the WHERE clause), or raise the timeout for that one connection.

Alternatives

Postgres compared

Alternative | When to use | Trade-off
Supabase MCP | You're on Supabase — get full project management plus SQL | Includes write access; less safe for prod
Neon MCP | You're on Neon — adds branching for safe migration testing | Neon-specific features only work on Neon DBs
dbHub | You need multi-database support (Postgres, MySQL, MongoDB, etc.) in one MCP | Newer; supports more DBs but each integration is shallower
sqlite MCP | Local file-based DB instead of a server | No concurrent access, no network, but zero setup

More

Resources

📖 Read the official README on GitHub

🐙 Open issues

🔍 All 400+ MCP servers and Skills