Add MCP tools to LibreChat / any OpenAI-compatible chat UI
When to use: You're running LibreChat, Big-AGI, or a custom app that calls /v1/chat/completions and you want tool use, but the client doesn't speak MCP.
Prerequisites
- An OpenAI-compatible inference backend — OpenAI, Anthropic-via-proxy, vLLM, Ollama, etc.
- At least one MCP server you want to expose — filesystem, fetch, postgres — whatever you've got
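Before wiring anything up, it's worth sanity-checking the first prerequisite: any OpenAI-compatible backend should answer the standard /v1/models route. The URL below is the OpenAI default; swap in your vLLM or Ollama endpoint as needed.

```shell
# List available models to confirm the backend speaks the OpenAI API.
# Self-hosted backends (vLLM, Ollama's OpenAI endpoint) may not need the key.
curl -s https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY"
```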
Workflow
- **Write config.json.** Prompt: "Write me an MCP-Bridge config.json that proxies OpenAI and exposes filesystem MCP (rooted at /data) and fetch MCP." → Valid config with `inference_server` and `mcp_servers` sections
- **Run via Docker.** Prompt: "Give me the docker run command to start MCP-Bridge using this config on port 8000." → Working docker command with volume mounts
- **Point the chat UI at the bridge.** Prompt: "Show me what API base URL to set in LibreChat to use the bridge instead of OpenAI directly." → Config pointing to http://localhost:8000/v1
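End to end, the three steps typically produce something like the sketch below. The config keys follow the `inference_server`/`mcp_servers` layout the first step asks for, but treat the exact field names, the Docker image name, and the LibreChat variable as assumptions to verify against the MCP-Bridge and LibreChat docs for the versions you run.

```shell
# 1. Write config.json (field names per MCP-Bridge's README; verify
#    against your version -- treated as assumptions here).
cat > config.json <<'EOF'
{
  "inference_server": {
    "base_url": "https://api.openai.com/v1",
    "api_key": "sk-your-key-here"
  },
  "mcp_servers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/data"]
    },
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  }
}
EOF

# 2. Run the bridge on port 8000, mounting the config and the /data root
#    the filesystem server expects. The image name and in-container config
#    path are placeholders -- use whatever your build produces.
docker run -d --name mcp-bridge \
  -p 8000:8000 \
  -v "$PWD/config.json:/mcp_bridge/config.json" \
  -v /data:/data \
  mcp-bridge:latest

# 3. Point LibreChat at the bridge instead of OpenAI, e.g. via its
#    reverse-proxy environment variable in .env:
#    OPENAI_REVERSE_PROXY=http://localhost:8000/v1
```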
Result: LibreChat conversations can now call filesystem and fetch tools, transparently.
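A quick way to confirm that the bridge is doing the tool work is to send a non-streaming request straight at it and ask for something only a tool can answer; the model name and key below are placeholders.

```shell
# A prompt like this should trigger the fetch MCP server behind the bridge,
# and the final assistant message should reflect the fetched page.
curl -s http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-your-key-here" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [
      {"role": "user", "content": "Fetch https://example.com and summarize it."}
    ],
    "stream": false
  }'
```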
Pitfalls
- Not all OpenAI-compatible clients support tool calls — verify your UI supports `functions`/`tool_calls` in responses before wiring it up; check its docs for "tool calling" support
- Streaming responses not yet implemented — disable streaming in the client; use non-streaming endpoints