
swarmvault

By swarmclawai · swarmclawai/swarmvault

Turn raw research — PDFs, transcripts, code, audio — into a local markdown wiki + knowledge graph your AI can query forever.

SwarmVault compiles mixed inputs (30+ formats) into an Obsidian-compatible wiki with a typed knowledge graph and hybrid search. Exposes an MCP server for Claude Code, Codex, OpenCode, and others. Every edge is tagged extracted/inferred/ambiguous for provenance. Local-first with an offline heuristic provider (no API keys needed).
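The extracted/inferred/ambiguous provenance tags can be pictured as a small data model. This is an illustrative sketch only — the `Edge` class and field names are hypothetical, not SwarmVault's actual schema:

```python
from dataclasses import dataclass

# Hypothetical edge model showing how provenance tags let a client
# separate facts from hypotheses. SwarmVault's real schema may differ.
@dataclass
class Edge:
    source: str
    target: str
    relation: str
    provenance: str  # "extracted" | "inferred" | "ambiguous"

edges = [
    Edge("post-quantum-tls", "nist-pqc", "cites", "extracted"),
    Edge("post-quantum-tls", "2030-migration", "implies", "inferred"),
]

# Treat only extracted edges as facts; inferred/ambiguous edges are hypotheses.
facts = [e for e in edges if e.provenance == "extracted"]
```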


Installation

Choose your client

~/Library/Application Support/Claude/claude_desktop_config.json  · Windows: %APPDATA%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "swarmvault": {
      "command": "npx",
      "args": [
        "-y",
        "swarmvault"
      ],
      "_inferred": true
    }
  }
}

Open Claude Desktop → Settings → Developer → Edit Config. Restart the app after saving.

~/.cursor/mcp.json · .cursor/mcp.json
{
  "mcpServers": {
    "swarmvault": {
      "command": "npx",
      "args": [
        "-y",
        "swarmvault"
      ],
      "_inferred": true
    }
  }
}

Cursor uses the same mcpServers schema as Claude Desktop. Project-level config takes precedence over global.

VS Code → Cline → MCP Servers → Edit
{
  "mcpServers": {
    "swarmvault": {
      "command": "npx",
      "args": [
        "-y",
        "swarmvault"
      ],
      "_inferred": true
    }
  }
}

Click the MCP Servers icon in the Cline sidebar and choose "Edit Configuration".

~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "swarmvault": {
      "command": "npx",
      "args": [
        "-y",
        "swarmvault"
      ],
      "_inferred": true
    }
  }
}

Same format as Claude Desktop. Restart Windsurf to apply.

~/.continue/config.json
{
  "mcpServers": [
    {
      "name": "swarmvault",
      "command": "npx",
      "args": [
        "-y",
        "swarmvault"
      ]
    }
  ]
}

Continue uses an array of server objects rather than a map.
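The array-vs-map difference can be seen by converting one shape to the other — a sketch assuming only the two config shapes shown above:

```python
# Convert a Claude-Desktop-style mcpServers map into Continue's
# array-of-objects form: each map key becomes a "name" field.
map_style = {
    "mcpServers": {
        "swarmvault": {"command": "npx", "args": ["-y", "swarmvault"]}
    }
}

array_style = {
    "mcpServers": [
        {"name": name, **cfg}
        for name, cfg in map_style["mcpServers"].items()
    ]
}
```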

~/.config/zed/settings.json
{
  "context_servers": {
    "swarmvault": {
      "command": {
        "path": "npx",
        "args": [
          "-y",
          "swarmvault"
        ]
      }
    }
  }
}

Add it under context_servers. Zed hot-reloads on save.

claude mcp add swarmvault -- npx -y swarmvault

One-liner. Verify with claude mcp list; remove with claude mcp remove.

Use cases

Practical ways to use swarmvault

Compound research across months into a queryable wiki

👤 Researchers and analysts doing deep work over time · ⏱ ~60 min · intermediate

When to use: You've been researching a topic for weeks and realize your notes are scattered.

Prerequisites
  • swarmvault CLI — npm install -g @swarmvaultai/cli
  • A vault — swarmvault init --obsidian --profile personal-research
Steps
  1. Dump raw sources
    Drop everything into raw/: PDFs, saved articles, meeting transcripts, code snippets.
    → Immutable raw/ folder populated
  2. Compile the wiki
    swarmvault compile — generate wiki/ with typed pages and link frontmatter.
    → Wiki pages created
  3. Query through MCP
    Via the MCP: what's the current consensus in my notes about 'post-quantum TLS timeline'? Include contradictions.
    → Answer with graph-walked citations and flagged contradictions

Result: A growing, queryable research base that gets smarter with every addition.

Gotchas
  • Noisy inputs produce a noisy graph — curate raw/: delete junk instead of ingesting everything
  • Over-relying on the graph for 'truth' — edges tagged 'inferred' or 'ambiguous' are hypotheses, not facts; respect the tags
Pairs well with: documentation-server

Turn meeting transcripts into structured team knowledge

👤 Teams doing a lot of recorded meetings · ⏱ ~45 min · intermediate

When to use: You have Fathom/Otter transcripts piling up and want to query across them.

Steps
  1. Drop transcripts in raw/
    Copy all my Otter transcripts from last quarter into raw/meetings/.
    → Files in place
  2. Compile
    swarmvault compile. Confirm entities (people, projects) were extracted.
    → Entity count report
  3. Query
    What did we decide about the pricing overhaul across all Q1 meetings? Cite each source.
    → Consolidated answer with source transcripts cited

Result: Institutional memory that survives people leaving.

Build a book-club wiki with cross-book reasoning

👤 Serious readers · ⏱ ~90 min · intermediate

When to use: You want the AI to reason across books, not just one at a time.

Steps
  1. Ingest
    Drop PDFs + notes for 5 books into raw/; compile.
    → Wiki with one page per book + concept pages
  2. Cross-reason
    Which concepts show up in 3+ books? Identify contradictions between authors.
    → Concept map with contradictions flagged

Result: A reading practice that compounds.

Combinations

Combine with other MCPs for 10x the power

swarmvault for structured wiki; documentation-server for quick-ingest search

Ingest new papers into both — compare retrieval quality and structure.
swarmvault + logseq

Export vault pages into a Logseq graph

Export wiki/ Markdown into my Logseq pages/ dir with frontmatter preserved.

Tools

What this MCP provides

Tool | Input | When to call | Cost
vault_search | query: str, top_k?: int | Main retrieval | free (local)
vault_get_page | slug: str | Full page read | free
vault_graph_neighbors | slug, depth? | Explore concept graph | free
vault_contradictions | topic?: str | Quality check | free
vault_compile | (none) | After adding new raw/ | CPU + optional embedding calls
vault_graph_report | scope?: str | Overview of a topic | free
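Under MCP, clients invoke these tools with a standard JSON-RPC 2.0 tools/call request. A sketch of what a client might send for vault_search — the parameter names come from the table above, but the exact argument schema is an assumption:

```python
import json

# JSON-RPC 2.0 request for the vault_search tool via MCP's tools/call method.
# The "arguments" keys (query, top_k) mirror the tool table; actual schema may vary.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "vault_search",
        "arguments": {"query": "post-quantum TLS timeline", "top_k": 5},
    },
}
print(json.dumps(request))
```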

Costs and limits

Operating costs

API quota
Zero with the heuristic provider; otherwise your LLM provider's limits
Tokens per call
Compile can stream large source content through embeddings; search queries typically run 500-2000 tokens
Price
Free; optional LLM providers incur normal LLM fees
Tip
Use the heuristic provider for a first-pass compile; upgrade to an LLM-backed compile only when quality matters.

Security

Permissions, secrets, and blast radius

Credential storage: Optional LLM provider keys in env/config
Data sent to: None by default (heuristic provider); opt-in provider calls only

Troubleshooting

Common errors and fixes

Compile fails on a specific PDF

Some PDFs are scanned/image-only. Pre-OCR with ocrmypdf before dropping in raw/.

Verify: pdftotext file.pdf - | head
MCP connection fails after agent install

Re-run swarmvault install --agent <name> and restart the agent; config paths vary per platform.

Verify: the client's MCP list shows swarmvault
Hybrid search returns nothing

Vault not compiled yet, or embeddings not built for offline mode. Run swarmvault compile.

Verify: swarmvault status

Alternatives

How swarmvault compares

Alternative | When to use instead | Trade-off
documentation-server | You want quick drag-and-drop RAG without wiki compilation | No structured graph or contradiction detection
Obsidian + Smart Connections | You live in Obsidian and want in-editor AI | Less structured compile pipeline
NotebookLM | You're OK with a Google-hosted, easy UI | Data leaves your machine; no graph export

More

Resources

📖 Read the official README on GitHub

🐙 Browse open issues

🔍 Browse 400+ MCP servers and Skills