
mcp-client-for-ollama

by jonigl · jonigl/mcp-client-for-ollama

TUI client that connects local Ollama models to any MCP server — agent mode, multi-server, human-in-the-loop, all from your terminal.

mcp-client-for-ollama (ollmcp) is a terminal UI that bridges Ollama's local LLMs with MCP servers. It supports agent mode (iterative tool execution), multi-server connections (STDIO/SSE/HTTP), human-in-the-loop approval, model switching, thinking mode, and performance metrics. For developers who want MCP tool-use without paying for cloud APIs.

Why use it

Key features

Live demo

In practice


Install

Choose your client

~/Library/Application Support/Claude/claude_desktop_config.json  · Windows: %APPDATA%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "client-for-ollama": {
      "command": "uvx",
      "args": [
        "mcp-client-for-ollama"
      ]
    }
  }
}

Open Claude Desktop → Settings → Developer → Edit Config. Restart after saving.

~/.cursor/mcp.json · .cursor/mcp.json
{
  "mcpServers": {
    "client-for-ollama": {
      "command": "uvx",
      "args": [
        "mcp-client-for-ollama"
      ]
    }
  }
}

Cursor uses the same mcpServers schema as Claude Desktop. The project-level config overrides the global one.

VS Code → Cline → MCP Servers → Edit
{
  "mcpServers": {
    "client-for-ollama": {
      "command": "uvx",
      "args": [
        "mcp-client-for-ollama"
      ]
    }
  }
}

Click the MCP Servers icon in the Cline sidebar, then "Edit Configuration".

~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "client-for-ollama": {
      "command": "uvx",
      "args": [
        "mcp-client-for-ollama"
      ]
    }
  }
}

Same structure as Claude Desktop. Restart Windsurf to apply.

~/.continue/config.json
{
  "mcpServers": [
    {
      "name": "client-for-ollama",
      "command": "uvx",
      "args": [
        "mcp-client-for-ollama"
      ]
    }
  ]
}

Continue uses an array of server objects instead of a map.

~/.config/zed/settings.json
{
  "context_servers": {
    "client-for-ollama": {
      "command": {
        "path": "uvx",
        "args": [
          "mcp-client-for-ollama"
        ]
      }
    }
  }
}

Add it under context_servers. Zed reloads on save.

claude mcp add client-for-ollama -- uvx mcp-client-for-ollama

One-liner. Verify with claude mcp list. Remove with claude mcp remove.

Use cases

Hands-on usage: mcp-client-for-ollama

How to use MCP tools with local LLMs for free

👤 Developers who want MCP functionality without cloud API costs · ⏱ ~15 min · beginner

When to use: You have MCP servers configured but want to use them with a local model instead of Claude or GPT.

Prerequisites
  • Ollama installed and running — ollama.com — install, then ollama pull llama3.2:3b
  • ollmcp installed — pip install ollmcp
Steps
  1. Launch with auto-discovery
    ollmcp --auto-discovery --model llama3.2:3b
    → TUI launches, shows discovered MCP servers from Claude config
  2. Test a tool call
    List the files in my current directory.
    → Model calls the filesystem MCP tool and returns results
  3. Enable agent mode for multi-step tasks
    Type /agent to enable agent mode, then: 'Find all TODO comments in this project and summarize them.'
    → Model iterates: searches files, reads matches, produces summary

Result: Working MCP tool-use powered entirely by a local model — zero API cost.

Pitfalls
  • Small models (3B) struggle with complex tool-use chains — Use 7B+ models for agent mode; 3B is fine for single tool calls
  • Model calls wrong tool or wrong params — Enable human-in-the-loop (/hil) to catch and correct bad tool calls
Combine with: filesystem

Connect to multiple MCP servers for a cross-tool workflow

👤 Power users running several MCP servers · ⏱ ~20 min · intermediate

When to use: You want a local LLM to orchestrate across filesystem, GitHub, and other MCP servers in one session.

Prerequisites
  • MCP server config JSON — Create a JSON file with all your server definitions (same format as Claude Desktop config)
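The servers file follows the Claude Desktop mcpServers format. A minimal sketch of what ~/mcp-servers.json could look like — the server names, npx packages, and token placeholder here are illustrative assumptions, not part of this listing:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/project"]
    },
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": { "GITHUB_PERSONAL_ACCESS_TOKEN": "<your-token>" }
    }
  }
}
```

Because the format matches Claude Desktop's config, you can reuse an existing claude_desktop_config.json unchanged.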
Steps
  1. Launch with config
    ollmcp --servers-json ~/mcp-servers.json --model qwen2.5:7b
    → All servers connected, tools listed
  2. Use /tools to manage
    Type /tools to see all available tools across servers. Disable any you don't need.
    → Tool list with enable/disable toggles
  3. Run a cross-server task
    Read my project's README.md, then search GitHub for similar projects and compare their approaches.
    → Model chains filesystem read + GitHub search

Result: A local LLM orchestrating tools from multiple MCP servers in one conversation.

Pitfalls
  • Too many tools confuse smaller models — Disable unused tools with /tools — fewer tools means better tool selection
Combine with: github

Use human-in-the-loop to safely test MCP servers

👤 Developers building or testing new MCP servers · ⏱ ~15 min · beginner

When to use: You're developing an MCP server and want to test tool calls with approval before execution.

Steps
  1. Enable HIL mode
    ollmcp -s ./my-mcp-server.py --model llama3.2:3b then type /hil to enable
    → Human-in-the-loop enabled
  2. Trigger tool calls
    Ask the model to use your server's tools. Each call pauses for your approval.
    → Tool call preview shown, waiting for y/n
  3. Review and iterate
    Reject bad calls, approve good ones. Check /show-metrics for timing.
    → Approved calls execute; rejected ones prompt retry

Result: A safe testing loop for MCP server development with full visibility into every tool call.

Combinations

With other MCPs for 10x impact

client-for-ollama + filesystem

Local Ollama model reads and edits files via filesystem MCP — fully offline coding assistant

ollmcp --auto-discovery --model qwen2.5:7b then: 'Read src/main.py and add error handling to the parse function.'
client-for-ollama + github

Use a local model to search GitHub repos and create issues without cloud API costs

Connect GitHub MCP to ollmcp, then: 'Search our org for repos with no CI config and create tracking issues.'

Costs & limits

What it costs to run

API quota
No external API quota — runs on local Ollama. MCP server limits apply per-server.
Tokens per call
Depends on Ollama model context window (typically 2k-128k)
Cost in €
Completely free. Ollama is free. ollmcp is MIT-licensed. You only pay for electricity.
Tip
This is the zero-cost option for MCP tool-use. Use smaller models (3B) for simple tasks, larger (7B+) for agent mode.

Security

Permissions, secrets, scope

Credential storage: MCP server credentials are configured in the servers JSON file or via environment variables — same as the Claude Desktop config
Data egress: Ollama runs locally. MCP servers egress to their own backends. No data leaves your machine by default unless an MCP server sends it.

Troubleshooting

Common errors and fixes

Connection refused to Ollama

Ensure Ollama is running: ollama serve. Default port is 11434. Use --host to specify a custom host if needed.

Check: curl http://localhost:11434/api/tags
Model doesn't call tools / ignores MCP

Not all models support tool-use well. Use qwen2.5:7b or llama3.1:8b which have good tool-use training. Avoid tiny models for complex chains.

Check: Test with a simple single-tool prompt first
MCP server fails to connect

Check your servers JSON config. Ensure the command path is absolute and the server binary/script exists. Run the server command manually to see errors.

Check: Run the MCP server command directly in a terminal
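The checks above can also be scripted as a quick pre-flight. A minimal Python sketch — the function name and the exact checks are my own, not part of ollmcp — that flags obviously broken entries in a Claude-style servers file:

```python
import json
import pathlib
import shutil


def check_servers_config(path):
    """Return (server, problem) pairs for obvious mistakes in an mcpServers file."""
    cfg = json.loads(pathlib.Path(path).read_text())
    problems = []
    for name, server in cfg.get("mcpServers", {}).items():
        cmd = server.get("command")
        if not cmd:
            problems.append((name, "missing 'command'"))
        # Accept commands resolvable on PATH or given as an existing file path.
        elif shutil.which(cmd) is None and not pathlib.Path(cmd).exists():
            problems.append((name, f"command not found: {cmd}"))
    return problems
```

Anything this flags would also fail inside ollmcp; running the server command by hand remains the definitive check.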
Auto-discovery finds no servers

ollmcp looks for Claude Desktop config at the standard path. On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json

Check: ls ~/Library/Application\ Support/Claude/claude_desktop_config.json

Alternatives

mcp-client-for-ollama vs. others

Alternative | When to use instead | Trade-off
Claude Desktop | You want the best MCP experience and are willing to pay for Claude API | Cloud-based, costs money, but much better tool-use than local models
oterm | You want an Ollama TUI without MCP focus | No MCP integration; just chat

More

Resources

📖 Read the official README on GitHub

🐙 View open issues

🔍 Browse all 400+ MCP servers and skills