
mcp-client-for-ollama

by jonigl · jonigl/mcp-client-for-ollama

TUI client that connects local Ollama models to any MCP server — agent mode, multi-server, human-in-the-loop, all from your terminal.

mcp-client-for-ollama (ollmcp) is a terminal UI that bridges Ollama's local LLMs with MCP servers. It supports agent mode (iterative tool execution), multi-server connections (STDIO/SSE/HTTP), human-in-the-loop approval, model switching, thinking mode, and performance metrics. For developers who want MCP tool-use without paying for cloud APIs.


Live demo

What it looks like in practice
(terminal replay: client-for-ollama.replay)

Installation

Choose your client

~/Library/Application Support/Claude/claude_desktop_config.json  · Windows: %APPDATA%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "client-for-ollama": {
      "command": "uvx",
      "args": [
        "mcp-client-for-ollama"
      ]
    }
  }
}

Open Claude Desktop → Settings → Developer → Edit Config. Restart after saving.

~/.cursor/mcp.json · .cursor/mcp.json
{
  "mcpServers": {
    "client-for-ollama": {
      "command": "uvx",
      "args": [
        "mcp-client-for-ollama"
      ]
    }
  }
}

Cursor uses the same mcpServers schema as Claude Desktop. The project config takes priority over the global one.

VS Code → Cline → MCP Servers → Edit
{
  "mcpServers": {
    "client-for-ollama": {
      "command": "uvx",
      "args": [
        "mcp-client-for-ollama"
      ]
    }
  }
}

Click the MCP Servers icon in the Cline sidebar, then "Edit Configuration".

~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "client-for-ollama": {
      "command": "uvx",
      "args": [
        "mcp-client-for-ollama"
      ]
    }
  }
}

Same format as Claude Desktop. Restart Windsurf to apply.

~/.continue/config.json
{
  "mcpServers": [
    {
      "name": "client-for-ollama",
      "command": "uvx",
      "args": [
        "mcp-client-for-ollama"
      ]
    }
  ]
}

Continue uses an array of server objects rather than a map.

~/.config/zed/settings.json
{
  "context_servers": {
    "client-for-ollama": {
      "command": {
        "path": "uvx",
        "args": [
          "mcp-client-for-ollama"
        ]
      }
    }
  }
}

Add it under context_servers. Zed reloads automatically.

claude mcp add client-for-ollama -- uvx mcp-client-for-ollama

One-liner. Verify with claude mcp list; remove with claude mcp remove.

Use cases

Real-world scenarios: mcp-client-for-ollama

How to use MCP tools with local LLMs for free

👤 Developers who want MCP functionality without cloud API costs ⏱ ~15 min beginner

When to use: You have MCP servers configured but want to use them with a local model instead of Claude or GPT.

Prerequisites
  • Ollama installed and running — ollama.com — install, then ollama pull llama3.2:3b
  • ollmcp installed — pip install ollmcp
Flow
  1. Launch with auto-discovery
    ollmcp --auto-discovery --model llama3.2:3b
    → TUI launches and shows the MCP servers discovered from your Claude config
  2. Test a tool call
    List the files in my current directory.
    → Model calls the filesystem MCP tool and returns the results
  3. Enable agent mode for multi-step tasks
    Type /agent to enable agent mode, then: 'Find all TODO comments in this project and summarize them.'
    → Model iterates: searches files, reads matches, produces a summary

Outcome: Working MCP tool-use powered entirely by a local model, at zero API cost.

Pitfalls
  • Small models (3B) struggle with complex tool-use chains — use 7B+ models for agent mode; 3B is fine for single tool calls
  • Model calls the wrong tool or passes the wrong params — enable human-in-the-loop (/hil) to catch and correct bad tool calls
Combine with: filesystem

Connect to multiple MCP servers for a cross-tool workflow

👤 Power users running several MCP servers ⏱ ~20 min intermediate

When to use: You want a local LLM to orchestrate across filesystem, GitHub, and other MCP servers in one session.

Prerequisites
  • MCP server config JSON — create a JSON file with all your server definitions (same format as the Claude Desktop config)
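For illustration, such a file could look like the sketch below. The filesystem and github entries, and their npx package names, are examples of commonly used MCP servers, not requirements; substitute your own:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/project"]
    },
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"]
    }
  }
}
```

Any server entry that works in Claude Desktop's mcpServers map should drop in unchanged.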
Flow
  1. Launch with the config
    ollmcp --servers-json ~/mcp-servers.json --model qwen2.5:7b
    → All servers connected, tools listed
  2. Use /tools to manage
    Type /tools to see all available tools across servers; disable any you don't need.
    → Tool list with enable/disable toggles
  3. Run a cross-server task
    Read my project's README.md, then search GitHub for similar projects and compare their approaches.
    → Model chains a filesystem read with a GitHub search

Outcome: A local LLM orchestrating tools from multiple MCP servers in one conversation.

Pitfalls
  • Too many tools confuse smaller models — disable unused ones with /tools; fewer tools means better tool selection
Combine with: github

Use human-in-the-loop to safely test MCP servers

👤 Developers building or testing new MCP servers ⏱ ~15 min beginner

When to use: You're developing an MCP server and want to approve each tool call before it executes.

Flow
  1. Enable HIL mode
    ollmcp -s ./my-mcp-server.py --model llama3.2:3b, then type /hil to enable
    → Human-in-the-loop enabled
  2. Trigger tool calls
    Ask the model to use your server's tools; each call pauses for your approval.
    → Tool-call preview shown, waiting for y/n
  3. Review and iterate
    Reject bad calls, approve good ones. Check /show-metrics for timing.
    → Approved calls execute; rejected ones prompt a retry

Outcome: A safe testing loop for MCP server development with full visibility into every tool call.

Combinations

Combine with other MCPs for a 10x effect

client-for-ollama + filesystem

Local Ollama model reads and edits files via filesystem MCP — fully offline coding assistant

ollmcp --auto-discovery --model qwen2.5:7b then: 'Read src/main.py and add error handling to the parse function.'
client-for-ollama + github

Use a local model to search GitHub repos and create issues without cloud API costs

Connect GitHub MCP to ollmcp, then: 'Search our org for repos with no CI config and create tracking issues.'

Cost and limits

What it costs

API quota
No external API quota — runs on local Ollama. MCP server limits apply per server.
Tokens per call
Depends on the Ollama model's context window (typically 2k-128k)
Money
Completely free: Ollama is free and ollmcp is MIT-licensed. You only pay for electricity.
Tip
This is the zero-cost option for MCP tool-use. Use smaller models (3B) for simple tasks and larger ones (7B+) for agent mode.

Security

Permissions, secrets, blast radius

Credential storage: MCP server credentials are configured in the servers JSON file or in environment variables, same as the Claude Desktop config
Egress: Ollama runs locally. MCP servers egress to their own backends. By default, no data leaves your machine unless an MCP server sends it.
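For example, a token-gated server can receive its secret through an env block in its server entry. The package and variable names below follow the GitHub MCP server's convention but are illustrative:

```json
{
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "<your-token>"
      }
    }
  }
}
```

Passing secrets via env rather than args keeps them out of process listings and shell history.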

Troubleshooting

Common errors and fixes

Connection refused to Ollama

Ensure Ollama is running: ollama serve. Default port is 11434. Use --host to specify a custom host if needed.

Verify: curl http://localhost:11434/api/tags
Model doesn't call tools / ignores MCP

Not all models support tool-use well. Use qwen2.5:7b or llama3.1:8b which have good tool-use training. Avoid tiny models for complex chains.

Verify: Test with a simple single-tool prompt first
MCP server fails to connect

Check your servers JSON config. Ensure the command path is absolute and the server binary/script exists. Run the server command manually to see errors.

Verify: Run the MCP server command directly in a terminal
Auto-discovery finds no servers

ollmcp looks for the Claude Desktop config at its standard path. On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json

Verify: ls ~/Library/Application\ Support/Claude/claude_desktop_config.json
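When the file exists but discovery still comes up empty, it can help to check what a parser actually sees in it. A minimal sketch, assuming only the config format shown above (this is not ollmcp's actual discovery code):

```python
import json
from pathlib import Path

def list_mcp_servers(config_path) -> list[str]:
    """Return the server names a Claude-Desktop-style config defines."""
    cfg = json.loads(Path(config_path).read_text())
    # Servers live under the top-level "mcpServers" map; an empty or
    # missing map is exactly the "no servers found" case.
    return sorted(cfg.get("mcpServers", {}))

# Example (macOS path from above):
# list_mcp_servers(Path.home() / "Library/Application Support/Claude/claude_desktop_config.json")
```

If this returns an empty list, the config parses fine but defines no servers; a JSONDecodeError points at a syntax problem instead.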

Alternatives

mcp-client-for-ollama compared

  • Claude Desktop — use when you want the best MCP experience and are willing to pay for the Claude API. Trade-off: cloud-based and paid, but much stronger tool-use than local models.
  • oterm — use when you want an Ollama TUI without an MCP focus. Trade-off: no MCP integration, chat only.

More

Resources

📖 Read the official README on GitHub

🐙 Open issues

🔍 All 400+ MCP servers and Skills