● Community · SecretiveShell · ⚡ Instant

MCP-Bridge

by SecretiveShell · SecretiveShell/MCP-Bridge

Use MCP tools from any OpenAI-compatible client — LibreChat, Open WebUI, your custom app — without native MCP support. Middleware that translates.

MCP-Bridge sits between your OpenAI-compatible client and inference backend. It advertises MCP server tools as OpenAI function-calling tools, dispatches calls, and returns results to complete the loop. Useful when your favorite chat UI doesn't speak MCP but speaks OpenAI.
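
A minimal sketch of what that drop-in swap looks like from the client side, assuming the bridge is already running on localhost:8000 and the openai Python package as the client:

from openai import OpenAI

# Identical to code written against OpenAI directly; only base_url changes.
# The bridge injects MCP tool definitions into the request, runs any tool
# calls against the attached MCP servers, and returns the final message.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="sk-...")

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Fetch https://example.com and summarize it."}],
)
print(response.choices[0].message.content)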

Why use it

Key features

Live demo

What it looks like in practice


Install

Choose your client

~/Library/Application Support/Claude/claude_desktop_config.json  · Windows: %APPDATA%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "bridge": {
      "command": "uvx",
      "args": [
        "MCP-Bridge"
      ]
    }
  }
}

Open Claude Desktop → Settings → Developer → Edit Config. Restart after saving.

~/.cursor/mcp.json · .cursor/mcp.json
{
  "mcpServers": {
    "bridge": {
      "command": "uvx",
      "args": [
        "MCP-Bridge"
      ]
    }
  }
}

Cursor uses the same mcpServers schema as Claude Desktop. Project config takes precedence over the global one.

VS Code → Cline → MCP Servers → Edit
{
  "mcpServers": {
    "bridge": {
      "command": "uvx",
      "args": [
        "MCP-Bridge"
      ]
    }
  }
}

Click the MCP Servers icon in Cline's sidebar, then "Edit Configuration".

~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "bridge": {
      "command": "uvx",
      "args": [
        "MCP-Bridge"
      ]
    }
  }
}

Same format as Claude Desktop. Restart Windsurf to apply.

~/.continue/config.json
{
  "mcpServers": [
    {
      "name": "bridge",
      "command": "uvx",
      "args": [
        "MCP-Bridge"
      ]
    }
  ]
}

Continue uses an array of server objects instead of a map.

~/.config/zed/settings.json
{
  "context_servers": {
    "bridge": {
      "command": {
        "path": "uvx",
        "args": [
          "MCP-Bridge"
        ]
      }
    }
  }
}

Add it under context_servers. Zed hot-reloads on save.

claude mcp add bridge -- uvx MCP-Bridge

A single command. Verify with claude mcp list. Remove with claude mcp remove bridge.

Use cases

Real-world uses: MCP-Bridge

Add MCP tools to LibreChat / any OpenAI-compatible chat UI

👤 Self-hosters of OSS chat frontends · ⏱ ~30 min · intermediate

When to use it: You're running LibreChat, Big-AGI, or a custom app that calls /v1/chat/completions and wants tool use, but it doesn't speak MCP.

Prerequisites
  • An OpenAI-compatible inference backend: OpenAI, Anthropic-via-proxy, vLLM, Ollama, etc.
  • At least one MCP server you want to expose (filesystem, fetch, postgres), whatever you've got
Flow
  1. Write config.json (a sketch of this file follows the flow)
    Write me an MCP-Bridge config.json that proxies OpenAI and exposes filesystem MCP (rooted at /data) and fetch MCP.
    → Valid config with inference_server and mcp_servers sections
  2. Run via Docker
    Give me the docker run command to start MCP-Bridge using this config on port 8000.
    → Working docker command with volume mounts
  3. Point the chat UI at the bridge
    Show me what API base URL to set in LibreChat to use the bridge instead of OpenAI directly.
    → Config pointing to http://localhost:8000/v1
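
A sketch of the config that step 1 asks for, written out via Python. The inference_server and mcp_servers section names come from the expected result above; the exact field names and the filesystem/fetch launch commands are assumptions to verify against the MCP-Bridge README:

import json

# Hypothetical config.json: OpenAI upstream, filesystem MCP rooted at /data,
# plus fetch MCP. Section names follow step 1's expected result.
config = {
    "inference_server": {
        "base_url": "https://api.openai.com/v1",
        "api_key": "sk-...",
    },
    "mcp_servers": {
        "filesystem": {
            "command": "npx",
            "args": ["-y", "@modelcontextprotocol/server-filesystem", "/data"],
        },
        "fetch": {"command": "uvx", "args": ["mcp-server-fetch"]},
    },
}

with open("config.json", "w") as f:
    json.dump(config, f, indent=2)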

Result: LibreChat conversations can now call filesystem and fetch tools, transparently.

Common pitfalls
  • Not all OpenAI-compatible clients support tool calls. Verify your UI supports functions in responses before wiring; check its docs for 'tool calling' support
  • Streaming responses not yet implemented. Disable streaming in the client; use non-streaming endpoints
Combine with: filesystem · fetch

Give your own Python/JS agent framework MCP tool access

👤 Devs building custom agents on the OpenAI SDK · ⏱ ~25 min · intermediate

When to use it: You're building with the raw OpenAI SDK (or LangChain's OpenAI client) and want to plug into the MCP ecosystem without rewriting the agent.

Flow
  1. Start MCP-Bridge locally
    Run MCP-Bridge with upstream set to OpenAI and these MCP servers: [list].
    → Bridge listening on :8000
  2. Point the OpenAI client base_url at the bridge (sketch below)
    Show me Python SDK init: client = OpenAI(base_url='http://localhost:8000/v1', api_key=...). Then call chat completions.
    → Code snippet that works unchanged
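
A sketch of step 2's "works unchanged" claim: the OpenAI Python SDK reads OPENAI_BASE_URL from the environment, so existing agent code can be redirected without edits (the localhost URL assumes a local bridge):

import os

# Redirect the stock SDK at the bridge before the client is constructed;
# every existing client.chat.completions.create(...) call stays untouched.
# OPENAI_API_KEY must still be set in the environment as usual.
os.environ["OPENAI_BASE_URL"] = "http://localhost:8000/v1"

from openai import OpenAI

client = OpenAI()  # picks up OPENAI_BASE_URL and OPENAI_API_KEY

Passing base_url explicitly, as in the prompt above, works just as well; the environment-variable route simply avoids touching the agent's construction code.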

Result: Zero-touch tool access for your existing agent code.

Common pitfalls
  • The bridge is a single point of failure. For prod, run it under supervisord/systemd with a healthcheck endpoint

Combinations

Combine it with other MCPs to 10x the value

bridge + filesystem + fetch

Budget self-hosted ChatGPT replacement with real tool use

Expose filesystem (rooted at ~/Notes) and fetch via MCP-Bridge, then use LibreChat to browse + summarize.

Tools

What this MCP exposes

Tool · Inputs · When to call · Cost
POST /v1/chat/completions · OpenAI-compatible messages; tools omitted (auto-injected) · Main entrypoint, drop-in for OpenAI · 1 LLM call + N tool calls
GET /tools · none · Discover what's available · free
SSE /bridge · none · Attach an external MCP client to the bridge over SSE · free
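
To sanity-check what the bridge is advertising, a quick stdlib-only call to the GET /tools endpoint from the table (host and port assume a default local run):

import json
import urllib.request

# Ask a locally running bridge which MCP tools it currently exposes.
with urllib.request.urlopen("http://localhost:8000/tools") as resp:
    print(json.dumps(json.load(resp), indent=2))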

Cost & limits

What it costs to run

API quota
Pass-through: whatever your upstream inference provider charges
Tokens per call
The bridge adds ~100-500 tokens of tool definitions per request
Monetary
Free (MIT). You pay for your LLM + wherever you host it.
Tip
Only attach the MCP servers you need; every attached tool bloats the system prompt.

Security

Permissions, secrets, scope

Credential storage: Upstream API key + MCP server creds live in config.json; lock down file permissions
Data egress: Requests go to your configured upstream (e.g. OpenAI) + whichever MCP servers you attach
Never: expose the bridge to the internet without enabling bearer auth

Troubleshooting

Common errors and fixes

Client says 'tool_use not supported'

Upstream model or client UI doesn't support function calling. Use a model that does (gpt-4o, claude, llama 3.1+).

MCP server connection refused

Check that the command in config.json actually runs; the bridge launches it as a subprocess. Test it manually: npx -y the-mcp.

401 from bridge when auth enabled

Set the Authorization: Bearer <key> header; the key must be listed in the config under security.auth.keys.
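
Because the bridge speaks the OpenAI wire format, the SDK's api_key parameter is enough to produce that header; a sketch, using a hypothetical key that would have to appear under security.auth.keys:

from openai import OpenAI

# The SDK sends api_key as "Authorization: Bearer <key>", which is the
# header the bridge's auth check expects.
client = OpenAI(
    base_url="http://localhost:8000/v1",
    api_key="my-bridge-key",  # hypothetical; must match security.auth.keys
)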

Alternatives

MCP-Bridge vs. others

Alternative · When to use it · Trade-off
Open WebUI native MCP · You specifically use Open WebUI 0.6.31+ · Built-in, no bridge needed, but Open WebUI only
LiteLLM with custom callbacks · You want multi-provider routing + tool injection · More complex; LiteLLM doesn't natively speak MCP either
mcpo · You want to expose MCP tools as plain OpenAPI for non-LLM clients too · Different shape: OpenAPI-first rather than chat-completions-first

More

Resources

📖 Read the official README on GitHub

🐙 View open issues

🔍 Browse all 400+ MCP servers and Skills