How to use MCP tools with local LLMs for free
When to use: You have MCP servers configured but want to drive them with a local model instead of Claude or GPT.
Prerequisites
- Ollama installed and running — ollama.com — after installing, run `ollama pull llama3.2:3b`
- ollmcp installed — `pip install ollmcp`
Steps
- Launch with auto-discovery: `ollmcp --auto-discovery --model llama3.2:3b` → the TUI launches and lists the MCP servers discovered from your Claude config
- Test a tool call: "List the files in my current directory." → the model calls the filesystem MCP tool and returns the results
- Enable agent mode for multi-step tasks: type `/agent` to enable agent mode, then: "Find all TODO comments in this project and summarize them." → the model iterates: it searches files, reads the matches, and produces a summary
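Under the hood, a single tool call boils down to one request against Ollama's chat API with a `tools` array. A minimal sketch of what that payload looks like, assuming Ollama's OpenAI-style function schema — the `list_directory` tool here is a hypothetical stand-in for whatever your filesystem MCP server actually exposes, and the exact payload ollmcp builds may differ:

```python
import json

# Sketch of a chat request carrying one tool definition. ollmcp translates
# each discovered MCP tool into an entry like the one under "tools".
request = {
    "model": "llama3.2:3b",
    "messages": [
        {"role": "user", "content": "List the files in my current directory."}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "list_directory",          # hypothetical MCP tool
                "description": "List files in a directory",
                "parameters": {
                    "type": "object",
                    "properties": {"path": {"type": "string"}},
                    "required": ["path"],
                },
            },
        }
    ],
    "stream": False,
}

print(json.dumps(request["tools"][0]["function"]["name"]))
```

If the model decides to use the tool, its reply contains a tool call with the tool name and arguments rather than plain text; ollmcp executes the matching MCP tool and feeds the result back.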
Result: working MCP tool use powered entirely by a local model — zero API cost.
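Agent mode is essentially a loop: keep asking the model, execute any tool call it emits, and stop once it answers without one. A self-contained sketch of that pattern — `fake_model` and `TOOLS` are hypothetical stand-ins for the local LLM and the MCP tool registry, not ollmcp's actual internals:

```python
from typing import Callable

# Hypothetical tool registry standing in for the MCP servers.
TOOLS: dict[str, Callable[[str], str]] = {
    "grep_todos": lambda path: "TODO: refactor parser (main.py:42)",
}

def fake_model(messages: list[dict]) -> dict:
    # Stand-in for the LLM: first turn it requests a tool call,
    # once a tool result is in the transcript it summarizes.
    if not any(m["role"] == "tool" for m in messages):
        return {"tool_call": {"name": "grep_todos", "args": "."}}
    return {"content": "1 TODO found: refactor parser in main.py."}

def agent_loop(prompt: str, max_steps: int = 5) -> str:
    messages = [{"role": "user", "content": prompt}]
    for _ in range(max_steps):
        reply = fake_model(messages)
        call = reply.get("tool_call")
        if call is None:
            return reply["content"]           # model is done iterating
        result = TOOLS[call["name"]](call["args"])
        messages.append({"role": "tool", "content": result})
    return "step limit reached"

print(agent_loop("Find all TODO comments in this project and summarize them."))
# → 1 TODO found: refactor parser in main.py.
```

The `max_steps` cap matters in practice: small local models can loop on the same tool call, and a hard limit keeps a runaway agent from spinning forever.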
Pitfalls
- Small models (3B) struggle with complex tool-use chains — use a 7B+ model for agent mode; 3B is fine for single tool calls
- The model calls the wrong tool or passes wrong parameters — enable human-in-the-loop mode (`/hil`) to catch and correct bad tool calls
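The human-in-the-loop idea is simple enough to sketch: every tool call is shown to the user before execution, who can approve it, skip it, or correct the arguments. This is an illustrative gate, not ollmcp's actual `/hil` implementation; `ask` is injected so the sketch runs without a real terminal:

```python
from typing import Callable, Optional

def hil_gate(
    tool_name: str,
    args: dict,
    ask: Callable[[str], str],
) -> Optional[tuple[str, dict]]:
    """Show a pending tool call to the user; return the (possibly
    corrected) call to execute, or None to skip it entirely."""
    answer = ask(f"Run {tool_name}({args})? [y/n/edit:<path>] ").strip().lower()
    if answer == "y":
        return tool_name, args
    if answer.startswith("edit:"):
        # e.g. "edit:/tmp" — user corrects a bad path before execution
        return tool_name, {**args, "path": answer[len("edit:"):]}
    return None  # skipped: the bad call never reaches the filesystem

# Approve, correct, and reject a call (answers are canned for the demo).
approved  = hil_gate("list_directory", {"path": "/etc"},   ask=lambda _: "y")
corrected = hil_gate("list_directory", {"path": "/wrong"}, ask=lambda _: "edit:/tmp")
skipped   = hil_gate("delete_file",    {"path": "/data"},  ask=lambda _: "n")
print(approved, corrected, skipped)
```

With a 3B model, expect to use the correction path often; the gate turns bad tool calls from silent failures into one-keystroke fixes.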