
mcp-local-rag

by shinpr · shinpr/mcp-local-rag

Private, local-first RAG — index your PDFs, docs, and code once, then search semantically from any MCP client. No API keys, no cloud, no data leaving your machine.

mcp-local-rag runs entirely offline after a ~90MB model download. Ingest PDF/DOCX/TXT/MD/HTML files or raw HTML strings, then query with combined semantic + keyword search. Ideal for personal knowledge bases, confidential documents, and working on flights.
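The "semantic + keyword" combination can be sketched in a few lines of Python. This is an illustrative toy only (bag-of-words cosine standing in for the real ~90MB neural embedding model, and a simple term-overlap score standing in for keyword matching), not the server's actual implementation:

```python
from collections import Counter
from math import sqrt

def vectorize(text: str) -> Counter:
    # Toy "embedding": a bag-of-words count vector. The real server uses a
    # neural embedding model; this stands in purely for illustration.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def keyword_score(query: str, doc: str) -> float:
    # Fraction of query terms that appear verbatim in the document.
    q = set(query.lower().split())
    d = set(doc.lower().split())
    return len(q & d) / len(q) if q else 0.0

def hybrid_search(query: str, docs: list[str], alpha: float = 0.5, top_k: int = 3):
    # Blend semantic similarity and exact keyword overlap, then rank.
    qv = vectorize(query)
    scored = [
        (alpha * cosine(qv, vectorize(d)) + (1 - alpha) * keyword_score(query, d), d)
        for d in docs
    ]
    return sorted(scored, reverse=True)[:top_k]

docs = [
    "positional encoding in long context transformers",
    "ring attention scales context length",
    "recipe for sourdough bread",
]
for score, doc in hybrid_search("ring attention", docs, top_k=2):
    print(f"{score:.2f}  {doc}")
```

The blend weight (`alpha` here) is the usual knob in hybrid retrieval: pure semantic search can drift on short queries, while the keyword term anchors exact matches.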


Install

Choose your client

~/Library/Application Support/Claude/claude_desktop_config.json  · Windows: %APPDATA%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "local-rag": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-local-rag"
      ]
    }
  }
}

Open Claude Desktop → Settings → Developer → Edit Config. Restart after saving.

~/.cursor/mcp.json · .cursor/mcp.json
{
  "mcpServers": {
    "local-rag": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-local-rag"
      ]
    }
  }
}

Cursor uses the same mcpServers schema as Claude Desktop. Project config overrides the global one.

VS Code → Cline → MCP Servers → Edit
{
  "mcpServers": {
    "local-rag": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-local-rag"
      ]
    }
  }
}

Click the MCP Servers icon in the Cline sidebar, then "Edit Configuration".

~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "local-rag": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-local-rag"
      ]
    }
  }
}

Same structure as Claude Desktop. Restart Windsurf to apply.

~/.continue/config.json
{
  "mcpServers": [
    {
      "name": "local-rag",
      "command": "npx",
      "args": [
        "-y",
        "mcp-local-rag"
      ]
    }
  ]
}

Continue uses an array of server objects instead of a map.

~/.config/zed/settings.json
{
  "context_servers": {
    "local-rag": {
      "command": {
        "path": "npx",
        "args": [
          "-y",
          "mcp-local-rag"
        ]
      }
    }
  }
}

Add it under context_servers. Zed reloads on save.

claude mcp add local-rag -- npx -y mcp-local-rag

One-liner. Verify with claude mcp list. Remove with claude mcp remove local-rag.

Use cases

Real-world usage: mcp-local-rag

Build a private RAG over your downloaded research papers and PDFs

👤 Researchers, students, knowledge workers · ⏱ ~30 min · beginner

When to use: You've hoarded hundreds of PDFs in ~/Documents/papers and want to actually use them — 'what did that paper say about attention decay?'

Prerequisites
  • PDFs or docs on disk — Any folder of files — recursive ingest supported
Steps
  1. Ingest the folder
    Ingest everything under ~/Documents/papers into local-rag. Skip files larger than 50MB.
    → Per-file ingest log + 'indexed N files' summary
  2. Ask questions
    Across my papers, what do they say about positional encoding in long-context transformers? Cite the source file and page if possible.
    → Synthesized answer with source file citations
  3. Refine search
    Just give me the top 5 passages most relevant to 'ring attention', raw — don't summarize.
    → Ranked passage list

Result: Every paper you've ever downloaded is now queryable by topic — a permanent upgrade to your reading life.

Pitfalls
  • Scanned PDFs have no extractable text — Run an OCR pass first (ocrmypdf) before ingesting
  • First index of 1000+ files is slow (CPU embeddings) — Leave it running overnight; incremental re-ingest is fast
Combine with: filesystem

Query confidential contracts / HR docs without leaking to any cloud

👤 Legal ops, HR, compliance · ⏱ ~20 min · intermediate

When to use: Documents are too sensitive for OpenAI/Claude cloud embeddings. You need search but can't send content anywhere.

Steps
  1. Ingest
    Ingest /secure/contracts/*.pdf into local-rag.
    → Files indexed locally; confirm no network call was made
  2. Query
    Which contracts have an auto-renewal clause longer than 12 months?
    → List of candidate contracts with the clause quoted

Result: Searchable private corpus with nothing leaving the machine.

Pitfalls
  • Claude answers still go to Anthropic — the embeddings are local but conversation isn't — If answers must also be local, run with a local LLM via Ollama or LM Studio instead of cloud Claude
Combine with: filesystem

Combinations

With other MCPs for 10x impact

local-rag + filesystem

Watch a folder, re-ingest files when they change

Every time a file under ~/Notes changes, re-ingest it into local-rag.
local-rag + firecrawl

Scrape a docs site then feed to local-rag for offline querying

Crawl docs.example.com, save each page as Markdown, then ingest all of them into local-rag.
local-rag + playwright

Capture JS-rendered pages and ingest their extracted text

Open this SPA, grab the rendered HTML, and ingest it into local-rag via ingest_data with the URL as source.

Tools

What this MCP provides

ingest_file
  Inputs: path: str | path[]
  When to call: Add one or more files to the index
  Cost: CPU only

ingest_data
  Inputs: html: str, source_url?: str
  When to call: Add a raw HTML blob — useful after scraping
  Cost: CPU only

query_documents
  Inputs: query: str, top_k?: int
  When to call: Main retrieval call — use before answering user questions
  Cost: free

list_files
  Inputs: none
  When to call: See what's indexed
  Cost: free

delete_file
  Inputs: path: str
  When to call: Remove a stale/irrelevant file from the index
  Cost: free

status
  Inputs: none
  When to call: Sanity check index size
  Cost: free

Costs & limits

What it costs to run

API quota
None — all local
Tokens per call
Query results: 500-3000 tokens depending on top_k
Cost
Free. One-time ~90MB model download.
Tip
Set top_k to 5-8 for most questions; going higher wastes tokens without improving answers.

Security

Permissions, secrets, scope

Credential storage: None — no API keys
Data egress: Zero after the model download. Your docs never leave the machine.

Troubleshooting

Common errors and fixes

First query is slow / seems to hang

Embedding model is downloading on first run (~90MB). Subsequent calls are fast.

Check: ~/.cache/mcp-local-rag should contain the model file
PDF ingest returns 0 chunks

PDF is likely scanned (image-only). Run ocrmypdf input.pdf output.pdf first.

Check: pdftotext input.pdf - (empty output means no text layer)
Results feel irrelevant

Pure semantic search struggles with short queries. Add more specific keywords; the hybrid search will boost exact matches.

Out of memory on large PDFs

Split the PDF first, or raise the Node heap limit: NODE_OPTIONS=--max-old-space-size=8192
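When the server is launched by an MCP client rather than from a shell, the same heap flag can be passed through the client config's env block (supported by Claude Desktop-style mcpServers configs). A sketch — the 8192MB value is an assumption to adapt to your machine:

```json
{
  "mcpServers": {
    "local-rag": {
      "command": "npx",
      "args": ["-y", "mcp-local-rag"],
      "env": {
        "NODE_OPTIONS": "--max-old-space-size=8192"
      }
    }
  }
}
```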

Alternatives

mcp-local-rag vs. others

Chroma MCP / Qdrant MCP
  When instead: You want a real vector DB with multi-user access, scaling, and metadata filters
  Trade-off: More setup, usually requires a running server

OpenAI Assistants file_search
  When instead: You're OK sending documents to OpenAI's cloud
  Trade-off: Not local, costs per token, but zero setup and more accurate

ChatGPT Projects / Claude Projects file upload
  When instead: Small document set (<20 files) and you use the hosted chat
  Trade-off: Not an MCP; can't be scripted

More

Resources

📖 Read the official README on GitHub

🐙 View open issues

🔍 Browse all 400+ MCP servers and skills