
mcp-local-rag

By shinpr · shinpr/mcp-local-rag

Private, local-first RAG — index your PDFs, docs, and code once, then search semantically from any MCP client. No API keys, no cloud, no data leaving your machine.

mcp-local-rag runs entirely offline after a ~90MB model download. Ingest PDF/DOCX/TXT/MD/HTML files or raw HTML strings, then query with combined semantic + keyword search. Ideal for personal knowledge bases, confidential documents, and working on flights.


Installation

Choose your client

~/Library/Application Support/Claude/claude_desktop_config.json  · Windows: %APPDATA%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "local-rag": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-local-rag"
      ]
    }
  }
}

Open Claude Desktop → Settings → Developer → Edit Config. Restart the app after saving.

~/.cursor/mcp.json · .cursor/mcp.json
{
  "mcpServers": {
    "local-rag": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-local-rag"
      ]
    }
  }
}

Cursor uses the same mcpServers schema as Claude Desktop. Project-level config takes precedence over global.

VS Code → Cline → MCP Servers → Edit
{
  "mcpServers": {
    "local-rag": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-local-rag"
      ]
    }
  }
}

Click the MCP Servers icon in the Cline sidebar, then choose "Edit Configuration".

~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "local-rag": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-local-rag"
      ]
    }
  }
}

Same format as Claude Desktop. Restart Windsurf to apply.

~/.continue/config.json
{
  "mcpServers": [
    {
      "name": "local-rag",
      "command": "npx",
      "args": [
        "-y",
        "mcp-local-rag"
      ]
    }
  ]
}

Continue uses an array of server objects rather than a map.

~/.config/zed/settings.json
{
  "context_servers": {
    "local-rag": {
      "command": {
        "path": "npx",
        "args": [
          "-y",
          "mcp-local-rag"
        ]
      }
    }
  }
}

Add under context_servers. Zed hot-reloads on save.

claude mcp add local-rag -- npx -y mcp-local-rag

One-line command for the Claude Code CLI. Verify with claude mcp list; remove with claude mcp remove.

Use cases

mcp-local-rag in practice

Build a private RAG over your downloaded research papers and PDFs

👤 Researchers, students, knowledge workers · ⏱ ~30 min · beginner

When to use: You've hoarded hundreds of PDFs in ~/Documents/papers and want to actually use them: "what did that paper say about attention decay?"

Prerequisites
  • PDFs or docs on disk: any folder of files; recursive ingest is supported
Steps
  1. Ingest the folder
    Ingest everything under ~/Documents/papers into local-rag. Skip files larger than 50MB.
    → Per-file ingest log + 'indexed N files' summary
  2. Ask questions
    Across my papers, what do they say about positional encoding in long-context transformers? Cite the source file and page if possible.
    → Synthesized answer with source file citations
  3. Refine search
    Just give me the top 5 passages most relevant to 'ring attention', raw — don't summarize.
    → Ranked passage list

Outcome: Every paper you've ever downloaded is now queryable by topic, a permanent upgrade to your reading life.

Pitfalls
  • Scanned PDFs have no extractable text: run an OCR pass (ocrmypdf) before ingesting
  • First index of 1000+ files is slow (CPU embeddings): leave it running overnight; incremental re-ingest is fast
Pairs well with: filesystem

Query confidential contracts / HR docs without leaking to any cloud

👤 Legal ops, HR, compliance · ⏱ ~20 min · intermediate

When to use: Documents are too sensitive for OpenAI/Claude cloud embeddings. You need search but can't send content anywhere.

Steps
  1. Ingest
    Ingest /secure/contracts/*.pdf into local-rag.
    → Files indexed locally; confirm no network call was made
  2. Query
    Which contracts have an auto-renewal clause longer than 12 months?
    → List of candidate contracts with the clause quoted

Outcome: A searchable private corpus with nothing leaving the machine.

Pitfalls
  • Claude's answers still go to Anthropic (the embeddings are local, but the conversation isn't). If answers must also stay local, run a local LLM via Ollama or LM Studio instead of cloud Claude
Pairs well with: filesystem

Combos

Combine with other MCPs for 10x leverage

local-rag + filesystem

Watch a folder, re-ingest files when they change

Every time a file under ~/Notes changes, re-ingest it into local-rag.
local-rag + firecrawl

Scrape a docs site then feed to local-rag for offline querying

Crawl docs.example.com, save each page as Markdown, then ingest all of them into local-rag.
local-rag + playwright

Capture JS-rendered pages and ingest their extracted text

Open this SPA, grab the rendered HTML, ingest_data it into local-rag with the URL as source.

Tools

What this MCP exposes

Tool · Input · When to call · Cost
ingest_file · path: str | path[] · Add one or more files to the index · CPU only
ingest_data · html: str, source_url?: str · Add a raw HTML blob (useful after scraping) · CPU only
query_documents · query: str, top_k?: int · Main retrieval call; use before answering user questions · free
list_files · (no input) · See what's indexed · free
delete_file · path: str · Remove a stale or irrelevant file from the index · free
status · (no input) · Sanity-check index size · free
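
MCP clients invoke these tools over JSON-RPC with the standard tools/call method. A sketch of a query_documents request, with argument names taken from the table above (the id and query text are illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "query_documents",
    "arguments": {
      "query": "positional encoding in long-context transformers",
      "top_k": 5
    }
  }
}
```

You normally never write this by hand; the client builds it from your natural-language prompt, which is why the recipes above are phrased as plain requests.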

Cost and limits

Operating cost

API quota: none (all local)
Tokens per call: query results run 500-3000 tokens depending on top_k
Price: free; one-time ~90MB model download

Set top_k to 5-8 for most questions; going higher wastes tokens without improving answers.

Security

Permissions, secrets, blast radius

Credentials stored: none (no API keys)
Data sent externally: zero after the model download; your docs never leave the machine.

Troubleshooting

Common errors and fixes

First query is slow / seems to hang

Embedding model is downloading on first run (~90MB). Subsequent calls are fast.

Verify: check ~/.cache/mcp-local-rag for the model file
PDF ingest returns 0 chunks

PDF is likely scanned (image-only). Run ocrmypdf input.pdf output.pdf first.

Verify: pdftotext input.pdf -
Results feel irrelevant

Pure semantic search struggles with short queries. Add more keywords; the hybrid search already boosts exact keyword matches.

Out of memory on large PDFs

Split the PDF first, or raise the Node heap: NODE_OPTIONS=--max-old-space-size=8192
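
Since the server is launched by your MCP client, the practical place to set that flag is the config's env block rather than your shell. A sketch for a Claude Desktop-style config (the env field is supported by Claude Desktop; check your client's docs for the equivalent):

```json
{
  "mcpServers": {
    "local-rag": {
      "command": "npx",
      "args": ["-y", "mcp-local-rag"],
      "env": {
        "NODE_OPTIONS": "--max-old-space-size=8192"
      }
    }
  }
}
```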

Alternatives

How mcp-local-rag compares

Alternative · When to use · Trade-offs
Chroma MCP / Qdrant MCP · You want a real vector DB with multi-user support, scaling, and metadata filters · More setup; usually requires a running server
OpenAI Assistants file_search · You're OK sending documents to OpenAI's cloud · Not local and costs per token, but zero setup and more accurate
ChatGPT Projects / Claude Projects file upload · Small document set (<20 files) and you use the hosted chat · Not an MCP; can't be scripted

See more

Resources

📖 Read the official README on GitHub

🐙 View open issues

🔍 Browse all 400+ MCP servers and Skills