
MARM-Systems

by Lyellr88 · Lyellr88/MARM-Systems

Persistent, searchable memory across Claude, Qwen, Gemini, and any MCP client — sessions, notebooks, and semantic recall in one server.

MARM-Systems is a Python MCP server that gives any AI client a persistent memory layer: sessions, notebooks, auto-classifying contextual logs, and semantic vector search. STDIO, HTTP, and WebSocket transports. SQLite backend with WAL mode for concurrent access.
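The WAL detail is what lets multiple MCP clients read while one writes. A minimal, self-contained sketch of the behavior SQLite itself provides (illustrative only; MARM's actual schema and connection handling are not shown in this listing):

```python
import os
import sqlite3
import tempfile

# Use a throwaway on-disk database: WAL mode needs a real file, not :memory:.
db = os.path.join(tempfile.mkdtemp(), "memory.db")

writer = sqlite3.connect(db)
mode = writer.execute("PRAGMA journal_mode=WAL").fetchone()[0]
print(mode)  # wal

writer.execute("CREATE TABLE entries (id INTEGER PRIMARY KEY, text TEXT)")
writer.execute("INSERT INTO entries (text) VALUES (?)",
               ("Decided to use Fivetran for CDC",))
writer.commit()

# A second connection (another client) reads without blocking on the writer.
reader = sqlite3.connect(db)
rows = reader.execute("SELECT text FROM entries").fetchall()
print(rows[0][0])  # Decided to use Fivetran for CDC
```

Under the default rollback journal, a long-running write can block readers; WAL avoids that, which is why it suits one server fed by several clients.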


Installation

Choose your client

~/Library/Application Support/Claude/claude_desktop_config.json  · Windows: %APPDATA%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "marm-systems": {
      "command": "uvx",
      "args": [
        "MARM-Systems"
      ]
    }
  }
}

Open Claude Desktop → Settings → Developer → Edit Config. Restart after saving.

~/.cursor/mcp.json · .cursor/mcp.json
{
  "mcpServers": {
    "marm-systems": {
      "command": "uvx",
      "args": [
        "MARM-Systems"
      ]
    }
  }
}

Cursor uses the same mcpServers schema as Claude Desktop. The project config takes precedence over the global one.

VS Code → Cline → MCP Servers → Edit
{
  "mcpServers": {
    "marm-systems": {
      "command": "uvx",
      "args": [
        "MARM-Systems"
      ]
    }
  }
}

Click the MCP Servers icon in the Cline sidebar, then "Edit Configuration".

~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "marm-systems": {
      "command": "uvx",
      "args": [
        "MARM-Systems"
      ]
    }
  }
}

Same format as Claude Desktop. Restart Windsurf to apply.

~/.continue/config.json
{
  "mcpServers": [
    {
      "name": "marm-systems",
      "command": "uvx",
      "args": [
        "MARM-Systems"
      ]
    }
  ]
}

Continue uses an array of server objects rather than a map.

~/.config/zed/settings.json
{
  "context_servers": {
    "marm-systems": {
      "command": {
        "path": "uvx",
        "args": [
          "MARM-Systems"
        ]
      }
    }
  }
}

Add it under context_servers. Zed reloads automatically.

claude mcp add marm-systems -- uvx MARM-Systems

A one-line command. Verify with claude mcp list; remove with claude mcp remove marm-systems.

Use cases

Real-world scenarios: MARM-Systems

Build persistent memory across sessions on a long project

👤 Anyone working on a multi-week effort with an AI ⏱ ~15 min · beginner

When to use: You keep re-explaining the same project background every session.

Prerequisites
  • MARM-Systems installed and running — docker pull lyellr88/marm-mcp-server && docker run -d -p 8001:8001 lyellr88/marm-mcp-server
Flow
  1. Start a session tagged to the project
    marm_start with project tag 'dataplatform-migration'. Log that we're migrating from Redshift to Snowflake, deadline end of Q2.
    → Session started; initial entry saved
  2. Drop context as you work
    marm_contextual_log: 'Decided to use Fivetran for CDC replication, evaluated Airbyte but config overhead too high.'
    → Auto-classified and stored
  3. Next session, recall
    marm_smart_recall 'what did we decide about CDC tooling?'
    → Relevant past decision surfaces

Outcome: Session N+1 starts with the full context of sessions 1 through N accessible by query instead of re-typed.

Pitfalls
  • Dumping every chat into memory clogs recall — use marm_contextual_log for decisions and milestones, not every exchange
  • Vector search misses on jargon — tag entries explicitly with project names for keyword fallback
Combine with: drift
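Conceptually, marm_smart_recall ranks stored entries by embedding similarity to the query. A toy sketch of that ranking step with hand-made vectors (the real server computes embeddings with a local model; the vectors and the `cosine` helper below are illustrative assumptions, not MARM's API):

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

# Hypothetical pre-computed embeddings for logged entries.
entries = [
    ("Decided to use Fivetran for CDC replication", [0.9, 0.1, 0.0]),
    ("Deadline for Snowflake migration is end of Q2", [0.1, 0.9, 0.1]),
    ("Lunch options near the office", [0.0, 0.1, 0.9]),
]

# Stands in for embed("what did we decide about CDC tooling?").
query_vec = [0.85, 0.2, 0.05]

ranked = sorted(entries, key=lambda e: cosine(query_vec, e[1]), reverse=True)
print(ranked[0][0])  # the CDC decision ranks first
```

This is also why sparse memories recall poorly: with few entries, the nearest neighbor by cosine distance can still be semantically far away.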

Share memory across a team using multiple AI assistants

👤 Teams where different members use Claude / Qwen / Gemini ⏱ ~45 min advanced

When to use: Knowledge trapped in one person's AI shouldn't be.

Flow
  1. Run MARM as a shared HTTP server
    Deploy MARM-Systems on a team server; each teammate points their MCP client at http://marm.team.internal:8001/mcp.
    → All clients connect
  2. Log shared context
    marm_contextual_log: 'Database backup runbook is at /runbooks/db-backup.md; last updated 2026-04-10.'
    → Everyone can recall it
  3. Recall from any client
    From a teammate's Qwen session: marm_smart_recall 'how do I restore the DB?'
    → Same answer surfaces

Outcome: Shared institutional memory that spans AI assistants.

Pitfalls
  • Shared memory needs real OAuth — don't rely on the dev-mode hardcoded credentials; wire it to your IdP before production use
  • Sensitive data leaks into shared recall — use project tags and scope queries by tag
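The tag-scoping mitigation can be sketched as a pre-filter that runs before any semantic search, so sensitive entries never enter the candidate set. The entry shape and tag names below are hypothetical, not MARM's stored format:

```python
entries = [
    {"text": "Database backup runbook is at /runbooks/db-backup.md", "tags": ["ops", "shared"]},
    {"text": "Candidate interview feedback for J. Doe", "tags": ["hr", "private"]},
    {"text": "Staging DB credentials rotated on 2026-04-10", "tags": ["ops", "private"]},
]

def scoped(entries, allow):
    """Keep only entries whose tags intersect the caller's allowed set."""
    return [e for e in entries if set(e["tags"]) & set(allow)]

# A teammate querying with only the 'shared' scope never sees private entries.
candidates = scoped(entries, allow=["shared"])
print([e["text"] for e in candidates])  # only the runbook entry survives
```

Filtering before ranking (rather than after) is the safer design: a post-filter can still leak via counts or summaries computed over the full set.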

Use notebooks as AI-queryable scratchpads

👤 Learners, researchers ⏱ ~5 min beginner

When to use: You want a topic-scoped note area the AI can search later.

Flow
  1. Create a notebook
    marm_notebook_add name='rust-ownership' with entry: 'Move vs borrow: move transfers ownership, borrow lets you peek.'
    → Notebook created
  2. Add as you learn
    marm_notebook_add 'rust-ownership': 'Mutable borrow is exclusive; only one at a time.'
    → Entry appended
  3. Recall later
    marm_notebook_show 'rust-ownership'
    → Full notebook

Outcome: A queryable learning journal.
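Semantically, the notebook tools behave like named, append-only lists of entries. A minimal in-memory model of the marm_notebook_add / marm_notebook_show behavior described above (a sketch of the concept, not the server's implementation, which persists to SQLite):

```python
from collections import defaultdict

notebooks = defaultdict(list)  # notebook name -> ordered entries

def notebook_add(name, entry):
    """Append an entry; creates the notebook on first use."""
    notebooks[name].append(entry)

def notebook_show(name):
    """Return all entries for a notebook, in insertion order."""
    return notebooks[name]

notebook_add("rust-ownership",
             "Move vs borrow: move transfers ownership, borrow lets you peek.")
notebook_add("rust-ownership",
             "Mutable borrow is exclusive; only one at a time.")
print(len(notebook_show("rust-ownership")))  # 2
```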

Combinations

Combine with other MCPs for a 10x effect

marm-systems + drift

drift remembers code conventions; MARM remembers project decisions

Recall drift conventions AND MARM decisions for project 'dataplatform-migration' and summarize where they interact.
marm-systems + claude-code-organizer

Move short-lived memories out of CC into MARM for long-term recall

Organizer listed my 12 biggest memories — move the 5 project-specific ones into MARM notebooks.

Tools

What this MCP provides

| Tool | Inputs | When to call | Cost |
| --- | --- | --- | --- |
| marm_start | project?, tags? | Start of any working session | free (local) |
| marm_refresh | session_id? | Reload context mid-session | free |
| marm_smart_recall | query: str, top_k?: int | Semantic search over past entries | free (local embeddings) |
| marm_contextual_log | text: str, tags?: str[] | Persist a decision/milestone/fact | free |
| marm_log_session | session_id | Review a past session | free |
| marm_notebook_add | name: str, entry: str | Topic-scoped notes | free |
| marm_notebook_use | name: str | Pin a topic for the session | free |
| marm_notebook_show | name: str | Read a notebook | free |
| marm_summary | scope: session / notebook / tag, id | Condense long history | free (local) |
| marm_context_bridge | from_session, to_session | Cross-project reasoning | free |

Cost and limits

What it costs

API quota
None — local-only
Tokens per call
Recall responses are typically 500-2000 tokens
Money
Free, open source
Tip
Rotate or archive old sessions periodically — SQLite handles the scale, but recall quality degrades with noise.
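The rotation tip amounts to a periodic job that moves stale rows out of the table recall searches. A sketch against a hypothetical sessions table (MARM's real schema and table names are not documented in this listing):

```python
import sqlite3
from datetime import datetime, timedelta

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sessions (id INTEGER PRIMARY KEY, created TEXT, body TEXT)")
# Empty clone of the schema to receive archived rows.
conn.execute("CREATE TABLE sessions_archive AS SELECT * FROM sessions WHERE 0")

now = datetime(2026, 4, 15)
conn.executemany("INSERT INTO sessions (created, body) VALUES (?, ?)", [
    ("2026-01-02", "old migration notes"),
    ("2026-04-14", "current sprint decisions"),
])

# ISO dates compare correctly as strings, so a plain < works for the cutoff.
cutoff = (now - timedelta(days=90)).strftime("%Y-%m-%d")
conn.execute("INSERT INTO sessions_archive SELECT * FROM sessions WHERE created < ?", (cutoff,))
conn.execute("DELETE FROM sessions WHERE created < ?", (cutoff,))
conn.commit()

live = conn.execute("SELECT body FROM sessions").fetchall()
archived = conn.execute("SELECT COUNT(*) FROM sessions_archive").fetchone()[0]
print(live)  # only the recent row stays in the searchable table
```

Archiving rather than deleting keeps old context retrievable on demand without it polluting everyday semantic recall.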

Security

Permissions, secrets, blast radius

Credential storage: Local SQLite; for shared HTTP mode, use env-based OAuth secrets
Outbound traffic: None by default (local storage); if you run it shared, only within your network
Never do: Expose port 8001 to the public internet without an auth layer

Troubleshooting

Common errors and fixes

Cannot connect to localhost:8001

The container is not running. Run docker ps to check; docker start marm-mcp-server.

Verify: curl http://localhost:8001/docs
Embeddings model download fails

The first run downloads the model and needs outbound network access. After that it works offline.

Verify: check docker logs for the HuggingFace download
Recall returns unrelated entries

Too few entries for good embeddings; add more, or pre-filter by tag.

Verify: marm_smart_recall with a tag filter

Alternatives

MARM-Systems compared

| Alternative | When to use | Trade-off |
| --- | --- | --- |
| drift | You want code-specific memory (conventions, decisions) | Code-focused; less general |
| Letta (MemGPT) | You want a research-grade memory agent, not just a server | Heavier to run; opinionated architecture |
| CC native memories | Claude Code only; no cross-client need | No semantic search; bloats context |

More

Resources

📖 Read the official README on GitHub

🐙 Open issues

🔍 All 400+ MCP servers and Skills