How to use MCP tools with local LLMs for free
When to use: You have MCP servers configured but want to use them with a local model instead of Claude or GPT.
Prerequisites
- Ollama installed and running: see ollama.com, then `ollama pull llama3.2:3b`
- ollmcp installed: `pip install ollmcp`
Steps

- Launch with auto-discovery:

  ```shell
  ollmcp --auto-discovery --model llama3.2:3b
  ```

  → The TUI launches and shows the MCP servers discovered from your Claude config.

- Test a tool call:

  ```
  List the files in my current directory.
  ```

  → The model calls the filesystem MCP tool and returns the results.

- Enable agent mode for multi-step tasks. Type `/agent` to enable agent mode, then:

  ```
  Find all TODO comments in this project and summarize them.
  ```

  → The model iterates: it searches files, reads the matches, and produces a summary.
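To make the steps above less magical, here is a minimal sketch of the loop a host like ollmcp runs: the model emits a tool call, the host dispatches it to a tool, and the result goes back to the model. The tool registry, `list_directory` tool, and JSON message shape are illustrative assumptions for this sketch, not ollmcp's or MCP's actual wire format.

```python
import json
import os


def list_directory(path: str) -> list[str]:
    """Stand-in for a filesystem MCP tool."""
    return sorted(os.listdir(path))


# Hypothetical tool registry; a real host builds this from MCP servers.
TOOLS = {"list_directory": list_directory}


def run_turn(model_reply: str) -> str:
    """If the model replied with a JSON tool call, execute it;
    otherwise pass the plain-text answer through unchanged."""
    try:
        call = json.loads(model_reply)
    except json.JSONDecodeError:
        return model_reply  # ordinary text answer, no tool needed
    tool = TOOLS[call["tool"]]
    result = tool(**call["arguments"])
    # In a real loop this result is sent back to the model,
    # which then writes the final natural-language answer.
    return json.dumps(result)


# Simulated model output requesting a tool call:
print(run_turn('{"tool": "list_directory", "arguments": {"path": "."}}'))
```

Agent mode is just this loop allowed to run for several iterations, with each tool result appended to the conversation before the model decides its next call.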
Result: Working MCP tool use powered entirely by a local model, at zero API cost.
Pitfalls
- Small models (3B) struggle with complex tool-use chains: use 7B+ models for agent mode; 3B is fine for single tool calls.
- The model calls the wrong tool or passes bad parameters: enable human-in-the-loop (`/hil`) to catch and correct bad tool calls before they run.
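The second pitfall's fix amounts to a confirmation gate in front of every tool execution. A minimal sketch, assuming a generic host loop (the function name and callback are illustrative, not ollmcp's internals):

```python
def guarded_call(tool_name, args, execute, confirm=input):
    """Show the pending tool call and only execute it on approval.

    `confirm` defaults to `input` for an interactive prompt, but any
    callable taking the prompt string works (handy for testing).
    """
    answer = confirm(f"Run {tool_name}({args})? [y/N] ")
    if answer.strip().lower() != "y":
        return None  # rejected: the tool never runs
    return execute(**args)


# Example with a harmless tool and an auto-approving callback:
result = guarded_call(
    "add", {"a": 2, "b": 3},
    execute=lambda a, b: a + b,
    confirm=lambda prompt: "y",
)
print(result)  # → 5
```

Defaulting to "no" on anything other than an explicit "y" means a mistyped or absent answer never executes a bad tool call, which is the whole point of human-in-the-loop mode.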