
mcp-client-for-ollama

by jonigl · jonigl/mcp-client-for-ollama

TUI client that connects local Ollama models to any MCP server — agent mode, multi-server, human-in-the-loop, all from your terminal.

mcp-client-for-ollama (ollmcp) is a terminal UI that bridges Ollama's local LLMs with MCP servers. It supports agent mode (iterative tool execution), multi-server connections (STDIO/SSE/HTTP), human-in-the-loop approval, model switching, thinking mode, and performance metrics. For developers who want MCP tool-use without paying for cloud APIs.

Why use it

Key features

Live demo

What it looks like in practice


Install

Choose your client

~/Library/Application Support/Claude/claude_desktop_config.json  · Windows: %APPDATA%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "client-for-ollama": {
      "command": "uvx",
      "args": [
        "mcp-client-for-ollama"
      ]
    }
  }
}

Open Claude Desktop → Settings → Developer → Edit Config. Restart after saving.

~/.cursor/mcp.json · .cursor/mcp.json
{
  "mcpServers": {
    "client-for-ollama": {
      "command": "uvx",
      "args": [
        "mcp-client-for-ollama"
      ]
    }
  }
}

Cursor uses the same mcpServers schema as Claude Desktop. Project config overrides the global one.

VS Code → Cline → MCP Servers → Edit
{
  "mcpServers": {
    "client-for-ollama": {
      "command": "uvx",
      "args": [
        "mcp-client-for-ollama"
      ]
    }
  }
}

Click the MCP Servers icon in the Cline sidebar, then "Edit Configuration".

~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "client-for-ollama": {
      "command": "uvx",
      "args": [
        "mcp-client-for-ollama"
      ]
    }
  }
}

Same format as Claude Desktop. Restart Windsurf to apply.

~/.continue/config.json
{
  "mcpServers": [
    {
      "name": "client-for-ollama",
      "command": "uvx",
      "args": [
        "mcp-client-for-ollama"
      ]
    }
  ]
}

Continue uses an array of server objects instead of a map.
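To illustrate the map-to-array difference, here is a small hypothetical helper (the name `to_continue_format` is illustrative, not part of any tool) that converts a Claude-style `mcpServers` map into the array form Continue expects:

```python
def to_continue_format(mcp_servers: dict) -> list:
    """Turn a Claude-style {name: spec} map into Continue's array of
    server objects, copying each spec and adding the required "name" key."""
    return [{"name": name, **spec} for name, spec in mcp_servers.items()]

claude_style = {
    "client-for-ollama": {"command": "uvx", "args": ["mcp-client-for-ollama"]}
}
continue_style = to_continue_format(claude_style)
# continue_style[0] == {"name": "client-for-ollama", "command": "uvx",
#                       "args": ["mcp-client-for-ollama"]}
```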

~/.config/zed/settings.json
{
  "context_servers": {
    "client-for-ollama": {
      "command": {
        "path": "uvx",
        "args": [
          "mcp-client-for-ollama"
        ]
      }
    }
  }
}

Add it under context_servers. Zed reloads automatically on save.

claude mcp add client-for-ollama -- uvx mcp-client-for-ollama

Just one line. Verify with claude mcp list. Remove with claude mcp remove.

Use cases

Real-world uses: mcp-client-for-ollama

How to use MCP tools with local LLMs for free

👤 Developers who want MCP functionality without cloud API costs ⏱ ~15 min beginner

When to use: You have MCP servers configured but want to use them with a local model instead of Claude or GPT.

Prerequisites
  • Ollama installed and running — ollama.com — install, then ollama pull llama3.2:3b
  • ollmcp installed — pip install ollmcp
Steps
  1. Launch with auto-discovery
    ollmcp --auto-discovery --model llama3.2:3b
    → TUI launches, shows discovered MCP servers from Claude config
  2. Test a tool call
    List the files in my current directory.
    → Model calls the filesystem MCP tool and returns results
  3. Enable agent mode for multi-step tasks
    Type /agent to enable agent mode, then: 'Find all TODO comments in this project and summarize them.'
    → Model iterates: searches files, reads matches, produces summary

Result: Working MCP tool-use powered entirely by a local model — zero API cost.

Pitfalls
  • Small models (3B) struggle with complex tool-use chains — Use 7B+ models for agent mode; 3B is fine for single tool calls
  • Model calls wrong tool or wrong params — Enable human-in-the-loop (/hil) to catch and correct bad tool calls
Combine with: filesystem

Connect to multiple MCP servers for a cross-tool workflow

👤 Power users running several MCP servers ⏱ ~20 min intermediate

When to use: You want a local LLM to orchestrate across filesystem, GitHub, and other MCP servers in one session.

Prerequisites
  • MCP server config JSON — Create a JSON file with all your server definitions (same format as Claude Desktop config)
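A minimal sketch of such a servers file, assuming the stock filesystem and GitHub MCP servers (the package names, directory path, and token value are illustrative placeholders; substitute your own):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/home/you/projects"]
    },
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": { "GITHUB_PERSONAL_ACCESS_TOKEN": "<your-token>" }
    }
  }
}
```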
Steps
  1. Launch with config
    ollmcp --servers-json ~/mcp-servers.json --model qwen2.5:7b
    → All servers connected, tools listed
  2. Use /tools to manage
    Type /tools to see all available tools across servers. Disable any you don't need.
    → Tool list with enable/disable toggles
  3. Run a cross-server task
    Read my project's README.md, then search GitHub for similar projects and compare their approaches.
    → Model chains filesystem read + GitHub search

Result: A local LLM orchestrating tools from multiple MCP servers in one conversation.

Pitfalls
  • Too many tools confuse smaller models — Disable unused tools with /tools — fewer tools means better tool selection
Combine with: github

Use human-in-the-loop to safely test MCP servers

👤 Developers building or testing new MCP servers ⏱ ~15 min beginner

When to use: You're developing an MCP server and want to test tool calls with approval before execution.

Steps
  1. Enable HIL mode
    ollmcp -s ./my-mcp-server.py --model llama3.2:3b then type /hil to enable
    → Human-in-the-loop enabled
  2. Trigger tool calls
    Ask the model to use your server's tools. Each call pauses for your approval.
    → Tool call preview shown, waiting for y/n
  3. Review and iterate
    Reject bad calls, approve good ones. Check /show-metrics for timing.
    → Approved calls execute; rejected ones prompt retry

Result: A safe testing loop for MCP server development with full visibility into every tool call.

Combinations

Combine with other MCPs for 10× leverage

client-for-ollama + filesystem

Local Ollama model reads and edits files via filesystem MCP — fully offline coding assistant

ollmcp --auto-discovery --model qwen2.5:7b then: 'Read src/main.py and add error handling to the parse function.'
client-for-ollama + github

Use a local model to search GitHub repos and create issues without cloud API costs

Connect GitHub MCP to ollmcp, then: 'Search our org for repos with no CI config and create tracking issues.'

Cost and limits

What it costs to run

API quota
No external API quota — runs on local Ollama. MCP server limits apply per server.
Tokens per call
Depends on the Ollama model's context window (typically 2k-128k)
Monetary
Completely free. Ollama is free. ollmcp is MIT-licensed. You only pay for electricity.
Tip
This is the zero-cost option for MCP tool-use. Use smaller models (3B) for simple tasks, larger ones (7B+) for agent mode.

Security

Permissions, secrets, scope

Credential storage: MCP server credentials are configured in the servers JSON file or via environment variables — same as the Claude Desktop config
Data egress: Ollama runs locally. MCP servers egress to their own backends. No data leaves your machine by default unless an MCP server sends it.

Troubleshooting

Common errors and fixes

Connection refused to Ollama

Ensure Ollama is running: ollama serve. Default port is 11434. Use --host to specify a custom host if needed.

Verify: curl http://localhost:11434/api/tags
Model doesn't call tools / ignores MCP

Not all models support tool-use well. Use qwen2.5:7b or llama3.1:8b which have good tool-use training. Avoid tiny models for complex chains.

Verify: Test with a simple single-tool prompt first
MCP server fails to connect

Check your servers JSON config. Ensure the command path is absolute and the server binary/script exists. Run the server command manually to see errors.

Verify: Run the MCP server command directly in a terminal
Auto-discovery finds no servers

ollmcp looks for Claude Desktop config at the standard path. On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json

Verify: ls ~/Library/Application\ Support/Claude/claude_desktop_config.json
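If you want to check the same path programmatically, here is a small sketch. This is a best guess at the per-platform defaults, not ollmcp's actual lookup code, and the Linux fallback in particular is an assumption:

```python
import os
import platform

def claude_config_path() -> str:
    """Best-guess default location of the Claude Desktop config that
    auto-discovery reads, per platform (a sketch, not ollmcp's own code)."""
    system = platform.system()
    if system == "Darwin":
        return os.path.expanduser(
            "~/Library/Application Support/Claude/claude_desktop_config.json")
    if system == "Windows":
        return os.path.join(os.environ.get("APPDATA", ""),
                            "Claude", "claude_desktop_config.json")
    # Linux has no official Claude Desktop build; this XDG path is an assumption
    return os.path.expanduser("~/.config/Claude/claude_desktop_config.json")

path = claude_config_path()
exists = os.path.exists(path)  # True if a config is present to discover
```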

Alternatives

mcp-client-for-ollama vs. others

Alternative | When to use | Trade-off
Claude Desktop | You want the best MCP experience and are willing to pay for the Claude API | Cloud-based, costs money, but much better tool-use than local models
oterm | You want an Ollama TUI without an MCP focus | No MCP integration; just chat

More

Resources

📖 Read the official README on GitHub

🐙 View open issues

🔍 Browse all 400+ MCP servers and Skills