
mcp-client-for-ollama

by jonigl · jonigl/mcp-client-for-ollama

TUI client that connects local Ollama models to any MCP server — agent mode, multi-server, human-in-the-loop, all from your terminal.

mcp-client-for-ollama (ollmcp) is a terminal UI that bridges Ollama's local LLMs with MCP servers. It supports agent mode (iterative tool execution), multi-server connections (STDIO/SSE/HTTP), human-in-the-loop approval, model switching, thinking mode, and performance metrics. For developers who want MCP tool-use without paying for cloud APIs.

Why use it

Key features

Live demo

In practice


Install

Choose your client

~/Library/Application Support/Claude/claude_desktop_config.json  · Windows: %APPDATA%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "client-for-ollama": {
      "command": "uvx",
      "args": [
        "mcp-client-for-ollama"
      ]
    }
  }
}

Open Claude Desktop → Settings → Developer → Edit Config. Restart after saving.

~/.cursor/mcp.json · .cursor/mcp.json
{
  "mcpServers": {
    "client-for-ollama": {
      "command": "uvx",
      "args": [
        "mcp-client-for-ollama"
      ]
    }
  }
}

Cursor uses the same mcpServers schema as Claude Desktop. Project config takes precedence over global.

VS Code → Cline → MCP Servers → Edit
{
  "mcpServers": {
    "client-for-ollama": {
      "command": "uvx",
      "args": [
        "mcp-client-for-ollama"
      ]
    }
  }
}

Click the MCP Servers icon in the Cline sidebar, then "Edit Configuration".

~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "client-for-ollama": {
      "command": "uvx",
      "args": [
        "mcp-client-for-ollama"
      ]
    }
  }
}

Same format as Claude Desktop. Restart Windsurf to apply.

~/.continue/config.json
{
  "mcpServers": [
    {
      "name": "client-for-ollama",
      "command": "uvx",
      "args": [
        "mcp-client-for-ollama"
      ]
    }
  ]
}

Continue uses an array of server objects rather than a map.

~/.config/zed/settings.json
{
  "context_servers": {
    "client-for-ollama": {
      "command": {
        "path": "uvx",
        "args": [
          "mcp-client-for-ollama"
        ]
      }
    }
  }
}

Add it under context_servers. Zed hot-reloads on save.

claude mcp add client-for-ollama -- uvx mcp-client-for-ollama

A single line. Verify with claude mcp list. Remove with claude mcp remove.

Use cases

Practical uses: mcp-client-for-ollama

How to use MCP tools with local LLMs for free

👤 Developers who want MCP functionality without cloud API costs · ⏱ ~15 min · beginner

When to use it: You have MCP servers configured but want to use them with a local model instead of Claude or GPT.

Prerequisites
  • Ollama installed and running — ollama.com — install, then ollama pull llama3.2:3b
  • ollmcp installed — pip install ollmcp
Steps
  1. Launch with auto-discovery
    ollmcp --auto-discovery --model llama3.2:3b
    → TUI launches, shows discovered MCP servers from Claude config
  2. Test a tool call
    List the files in my current directory.
    → Model calls the filesystem MCP tool and returns results
  3. Enable agent mode for multi-step tasks
    Type /agent to enable agent mode, then: 'Find all TODO comments in this project and summarize them.'
    → Model iterates: searches files, reads matches, produces summary

Result: Working MCP tool-use powered entirely by a local model, at zero API cost.

Pitfalls
  • Small models (3B) struggle with complex tool-use chains — Use 7B+ models for agent mode; 3B is fine for single tool calls
  • Model calls wrong tool or wrong params — Enable human-in-the-loop (/hil) to catch and correct bad tool calls
Combine with: filesystem

Connect to multiple MCP servers for a cross-tool workflow

👤 Power users running several MCP servers · ⏱ ~20 min · intermediate

When to use it: You want a local LLM to orchestrate across filesystem, GitHub, and other MCP servers in one session.

Prerequisites
  • MCP server config JSON — Create a JSON file with all your server definitions (same format as Claude Desktop config)
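
The servers JSON follows the same mcpServers shape as the Claude Desktop config shown above. A minimal sketch of what ~/mcp-servers.json could look like; the filesystem and github entries, the /path/to/project argument, and the token placeholder are illustrative assumptions, not required values:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/project"]
    },
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": { "GITHUB_PERSONAL_ACCESS_TOKEN": "<your-token>" }
    }
  }
}
```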
Steps
  1. Launch with config
    ollmcp --servers-json ~/mcp-servers.json --model qwen2.5:7b
    → All servers connected, tools listed
  2. Use /tools to manage
    Type /tools to see all available tools across servers. Disable any you don't need.
    → Tool list with enable/disable toggles
  3. Run a cross-server task
    Read my project's README.md, then search GitHub for similar projects and compare their approaches.
    → Model chains filesystem read + GitHub search

Result: A local LLM orchestrating tools from multiple MCP servers in one conversation.

Pitfalls
  • Too many tools confuse smaller models — Disable unused tools with /tools — fewer tools means better tool selection
Combine with: github

Use human-in-the-loop to safely test MCP servers

👤 Developers building or testing new MCP servers · ⏱ ~15 min · beginner

When to use it: You're developing an MCP server and want to test tool calls with approval before execution.

Steps
  1. Enable HIL mode
    ollmcp -s ./my-mcp-server.py --model llama3.2:3b then type /hil to enable
    → Human-in-the-loop enabled
  2. Trigger tool calls
    Ask the model to use your server's tools. Each call pauses for your approval.
    → Tool call preview shown, waiting for y/n
  3. Review and iterate
    Reject bad calls, approve good ones. Check /show-metrics for timing.
    → Approved calls execute; rejected ones prompt retry

Result: A safe testing loop for MCP server development with full visibility into every tool call.

Combinations

Pair it with other MCPs for a 10x effect

client-for-ollama + filesystem

Local Ollama model reads and edits files via filesystem MCP — fully offline coding assistant

ollmcp --auto-discovery --model qwen2.5:7b then: 'Read src/main.py and add error handling to the parse function.'
client-for-ollama + github

Use a local model to search GitHub repos and create issues without cloud API costs

Connect GitHub MCP to ollmcp, then: 'Search our org for repos with no CI config and create tracking issues.'

Cost and limits

Running cost

API quota
No external API quota — runs on local Ollama. MCP server limits apply per-server.
Tokens per call
Depends on the Ollama model's context window (typically 2k-128k)
Monetary
Completely free. Ollama is free. ollmcp is MIT-licensed. You only pay for electricity.
Tip
This is the zero-cost option for MCP tool-use. Use smaller models (3B) for simple tasks, larger (7B+) for agent mode.

Security

Permissions, secrets, scope

Credential storage: MCP server credentials are configured in the servers JSON file or in environment variables (same as the Claude Desktop config)
Data egress: Ollama runs locally. MCP servers egress to their own backends. No data leaves your machine by default unless an MCP server sends it.

Troubleshooting

Common errors and fixes

Connection refused to Ollama

Ensure Ollama is running: ollama serve. Default port is 11434. Use --host to specify a custom host if needed.

Check: curl http://localhost:11434/api/tags
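
The curl check above can also be scripted. A minimal sketch, assuming Ollama's /api/tags endpoint returns a JSON body with a "models" list of objects that each carry a "name" field; the helper names here are ours, not part of ollmcp:

```python
import json
import urllib.request


def installed_models(tags_payload: str) -> list[str]:
    """Extract model names from an Ollama /api/tags response body."""
    data = json.loads(tags_payload)
    return sorted(m["name"] for m in data.get("models", []))


def check_ollama(host: str = "http://localhost:11434") -> list[str]:
    """Raise URLError if Ollama is unreachable; otherwise list installed models."""
    with urllib.request.urlopen(f"{host}/api/tags", timeout=5) as resp:
        return installed_models(resp.read().decode())
```

If check_ollama raises, start the daemon with ollama serve before retrying; if it returns an empty list, pull a model first.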
Model doesn't call tools / ignores MCP

Not all models support tool-use well. Use qwen2.5:7b or llama3.1:8b which have good tool-use training. Avoid tiny models for complex chains.

Check: Test with a simple single-tool prompt first
MCP server fails to connect

Check your servers JSON config. Ensure the command path is absolute and the server binary/script exists. Run the server command manually to see errors.

Check: Run the MCP server command directly in a terminal
Auto-discovery finds no servers

ollmcp looks for Claude Desktop config at the standard path. On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json

Check: ls ~/Library/Application\ Support/Claude/claude_desktop_config.json
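
Since auto-discovery reads the standard Claude Desktop config, a quick way to see what it should find is to list the mcpServers keys yourself. A sketch under that assumption, not ollmcp's actual discovery code:

```python
import json
from pathlib import Path


def servers_in_config(config_text: str) -> list[str]:
    """Server names under mcpServers in a Claude-Desktop-style config."""
    return sorted(json.loads(config_text).get("mcpServers", {}).keys())


# Usage: macOS default path, per the note above.
config_path = Path.home() / "Library/Application Support/Claude/claude_desktop_config.json"
if config_path.exists():
    print(servers_in_config(config_path.read_text()))
```

An empty list means the config exists but defines no servers, which auto-discovery would also report as nothing found.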

Alternatives

mcp-client-for-ollama vs others

Alternative · When to use it · Trade-off
Claude Desktop · You want the best MCP experience and are willing to pay for the Claude API · Cloud-based and costs money, but much better tool-use than local models
oterm · You want an Ollama TUI without MCP focus · No MCP integration; just chat

More

Resources

📖 Read the official README on GitHub

🐙 View open issues

🔍 Browse 400+ MCP servers and Skills