Add MCP tools to LibreChat / any OpenAI-compatible chat UI
When to use: You're running LibreChat, Big-AGI, or a custom app that calls /v1/chat/completions, you want tool use, but the client doesn't speak MCP.
Prerequisites
- An OpenAI-compatible inference backend — OpenAI, Anthropic-via-proxy, vLLM, Ollama, etc.
- At least one MCP server you want to expose — filesystem, fetch, postgres — whatever you've got
Flow
- Write config.json
  Prompt: "Write me an MCP-Bridge config.json that proxies OpenAI and exposes filesystem MCP (rooted at /data) and fetch MCP."
  Result: a valid config with inference_server and mcp_servers sections.
- Run via Docker
  Prompt: "Give me the docker run command to start MCP-Bridge using this config on port 8000."
  Result: a working docker command with volume mounts.
- Point the chat UI at the bridge
  Prompt: "Show me what API base URL to set in LibreChat to use the bridge instead of OpenAI directly."
  Result: config pointing to http://localhost:8000/v1.
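For step one, here is a sketch of what such a config might look like. The inference_server and mcp_servers section names come from the step above, but the exact field names are assumptions; check the MCP-Bridge documentation for the authoritative schema. The API key is a placeholder, and the two MCP server launch commands use the reference filesystem and fetch servers.

```json
{
  "inference_server": {
    "base_url": "https://api.openai.com/v1",
    "api_key": "sk-..."
  },
  "mcp_servers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/data"]
    },
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  }
}
```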
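For step two, a hedged sketch of the docker invocation. The image name and in-container config path are assumptions; substitute whatever the MCP-Bridge README publishes. The /data mount must match the root you gave the filesystem MCP server.

```shell
# Sketch only: image name and config path inside the container are assumed,
# not taken from MCP-Bridge's docs. Adjust to the published image.
docker run -d \
  --name mcp-bridge \
  -p 8000:8000 \
  -v "$(pwd)/config.json:/mcp_bridge/config.json" \
  -v /data:/data \
  mcp-bridge:latest
```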
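For step three, LibreChat takes custom OpenAI-compatible backends via a custom endpoint in librechat.yaml. A minimal sketch, assuming the bridge runs on localhost:8000; the model name is illustrative, and the field names should be verified against LibreChat's custom-endpoint docs:

```yaml
# librechat.yaml (sketch): point a custom endpoint at the bridge
endpoints:
  custom:
    - name: "MCP-Bridge"
      apiKey: "sk-..."          # forwarded to the upstream backend
      baseURL: "http://localhost:8000/v1"
      models:
        default: ["gpt-4o"]     # illustrative model name
```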
Outcome: LibreChat conversations can now call filesystem and fetch tools, transparently.
Pitfalls
- Not all OpenAI-compatible clients support tool calls. Verify your UI handles function/tool calls in responses before wiring it up; check its docs for "tool calling" support.
- Streaming responses are not yet implemented. Disable streaming in the client and use non-streaming endpoints.
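Both pitfalls can be checked with one request against a running bridge. A sketch, assuming the bridge is up on localhost:8000; the model name is illustrative. Note stream is explicitly false, and any tool activity would surface in the returned message:

```shell
# Non-streaming sanity check against the bridge (requires it to be running).
curl -s http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "List the files in /data"}],
    "stream": false
  }'
```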