
MARM-Systems

by Lyellr88 · Lyellr88/MARM-Systems

Persistent, searchable memory across Claude, Qwen, Gemini, and any MCP client — sessions, notebooks, and semantic recall in one server.

MARM-Systems is a Python MCP server that gives any AI client a persistent memory layer: sessions, notebooks, auto-classifying contextual logs, and semantic vector search. STDIO, HTTP, and WebSocket transports. SQLite backend with WAL mode for concurrent access.
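The concurrent-access claim comes down to SQLite's write-ahead logging. A minimal sketch of enabling it (illustrative only; the file name and pragma choices are assumptions, not MARM's actual initialization code):

```python
import sqlite3

def open_memory_db(path: str) -> sqlite3.Connection:
    """Open the memory store with WAL enabled so readers
    don't block the writer (and vice versa)."""
    conn = sqlite3.connect(path)
    # WAL mode is persistent in the database file; re-issuing
    # the pragma on each open is harmless.
    conn.execute("PRAGMA journal_mode=WAL")
    conn.execute("PRAGMA busy_timeout=5000")  # wait up to 5 s on lock contention
    return conn

conn = open_memory_db("marm_memory.db")
print(conn.execute("PRAGMA journal_mode").fetchone()[0])  # → wal
```

With WAL, a recall query from one client can run while another client's `marm_contextual_log` write is in flight, which is what makes the multi-transport setup practical.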


Install

Choose your client

~/Library/Application Support/Claude/claude_desktop_config.json  · Windows: %APPDATA%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "marm-systems": {
      "command": "uvx",
      "args": [
        "MARM-Systems"
      ]
    }
  }
}

Open Claude Desktop → Settings → Developer → Edit Config. Restart after saving.

~/.cursor/mcp.json · .cursor/mcp.json
{
  "mcpServers": {
    "marm-systems": {
      "command": "uvx",
      "args": [
        "MARM-Systems"
      ]
    }
  }
}

Cursor uses the same mcpServers schema as Claude Desktop. Project config overrides the global one.

VS Code → Cline → MCP Servers → Edit
{
  "mcpServers": {
    "marm-systems": {
      "command": "uvx",
      "args": [
        "MARM-Systems"
      ]
    }
  }
}

Click the MCP Servers icon in the Cline sidebar, then "Edit Configuration".

~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "marm-systems": {
      "command": "uvx",
      "args": [
        "MARM-Systems"
      ]
    }
  }
}

Same format as Claude Desktop. Restart Windsurf to apply.

~/.continue/config.json
{
  "mcpServers": [
    {
      "name": "marm-systems",
      "command": "uvx",
      "args": [
        "MARM-Systems"
      ]
    }
  ]
}

Continue uses an array of server objects instead of a map.

~/.config/zed/settings.json
{
  "context_servers": {
    "marm-systems": {
      "command": {
        "path": "uvx",
        "args": [
          "MARM-Systems"
        ]
      }
    }
  }
}

Add it under context_servers. Zed reloads automatically on save.

claude mcp add marm-systems -- uvx MARM-Systems

One line. Verify with claude mcp list. Remove with claude mcp remove marm-systems.

Use cases

Real-world uses: MARM-Systems

Build persistent memory across sessions on a long project

👤 Anyone working on a multi-week effort with an AI · ⏱ ~15 min · beginner

When to use: You keep re-explaining the same project background every session.

Prerequisites
  • MARM-Systems installed and running — docker pull lyellr88/marm-mcp-server && docker run -d -p 8001:8001 lyellr88/marm-mcp-server
Workflow
  1. Start a session tagged to the project
    marm_start with project tag 'dataplatform-migration'. Log that we're migrating from Redshift to Snowflake, deadline end of Q2.
    → Session started; initial entry saved
  2. Drop context as you work
    marm_contextual_log: 'Decided to use Fivetran for CDC replication, evaluated Airbyte but config overhead too high.'
    → Auto-classified and stored
  3. Next session, recall
    marm_smart_recall 'what did we decide about CDC tooling?'
    → Relevant past decision surfaces

Result: Session N+1 starts with all of sessions 1–N's context accessible by query instead of re-typed.

Pitfalls
  • Dumping every chat into memory clogs recall — use marm_contextual_log for decisions and milestones, not every exchange
  • Vector search misses on jargon — tag entries explicitly with project names for keyword fallback
Combine with: drift

Share memory across a team using multiple AI assistants

👤 Teams where different members use Claude / Qwen / Gemini · ⏱ ~45 min · advanced

When to use: Knowledge trapped in one person's AI shouldn't be.

Workflow
  1. Run MARM as a shared HTTP server
    Deploy MARM-Systems on a team server; each teammate points their MCP client at http://marm.team.internal:8001/mcp.
    → All clients connect
  2. Log shared context
    marm_contextual_log: 'Database backup runbook is at /runbooks/db-backup.md; last updated 2026-04-10.'
    → Everyone can recall it
  3. Recall from any client
    From a teammate's Qwen session: marm_smart_recall 'how do I restore the DB?'
    → Same answer surfaces

Result: Cross-AI shared institutional memory.

Pitfalls
  • Shared memory needs real OAuth — don't rely on the dev-mode hardcoded credentials; wire up your IdP before production use
  • Sensitive data leaks into shared recall — use project tags and scope queries by tag
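Under the hood, every client in this setup speaks JSON-RPC 2.0 to the shared endpoint. A sketch of what a `tools/call` request body looks like (the hostname is the hypothetical one from the step above; exact transport framing follows the MCP spec):

```python
import json

def tools_call_payload(tool: str, arguments: dict, request_id: int = 1) -> str:
    """Build an MCP `tools/call` JSON-RPC 2.0 message, as any
    teammate's client would send to the shared server."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

body = tools_call_payload("marm_smart_recall", {"query": "how do I restore the DB?"})
print(body)
# POST this to http://marm.team.internal:8001/mcp with
# Content-Type: application/json (transport details per the MCP spec).
```

This is also where the OAuth pitfall bites: anything that can POST this payload to port 8001 can read the team's memory, so the endpoint needs an auth layer in front of it.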

Use notebooks as AI-queryable scratchpads

👤 Learners, researchers · ⏱ ~5 min · beginner

When to use: You want a topic-scoped note area the AI can search later.

Workflow
  1. Create a notebook
    marm_notebook_add name='rust-ownership' with entry: 'Move vs borrow: move transfers ownership, borrow lets you peek.'
    → Notebook created
  2. Add as you learn
    marm_notebook_add 'rust-ownership': 'Mutable borrow is exclusive; only one at a time.'
    → Entry appended
  3. Recall later
    marm_notebook_show 'rust-ownership'
    → Full notebook

Result: A queryable learning journal.
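Conceptually, a notebook is just a named, append-only list of entries. A sketch of the storage model (hypothetical schema; not MARM's actual tables):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE notebook_entries (
    notebook TEXT NOT NULL,
    entry TEXT NOT NULL,
    created_at TEXT DEFAULT CURRENT_TIMESTAMP
)""")

def notebook_add(name: str, entry: str) -> None:
    """Append an entry to a notebook, creating it implicitly."""
    conn.execute("INSERT INTO notebook_entries (notebook, entry) VALUES (?, ?)", (name, entry))

def notebook_show(name: str) -> list[str]:
    """Return a notebook's entries in insertion order."""
    rows = conn.execute(
        "SELECT entry FROM notebook_entries WHERE notebook = ? ORDER BY rowid", (name,))
    return [r[0] for r in rows]

notebook_add("rust-ownership", "Move vs borrow: move transfers ownership, borrow lets you peek.")
notebook_add("rust-ownership", "Mutable borrow is exclusive; only one at a time.")
print(len(notebook_show("rust-ownership")))  # → 2
```

The notebook name acts as the scope key, which is what lets marm_notebook_use pin one topic for a session while marm_smart_recall still searches everything.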

Combinations

Combine with other MCPs for 10× leverage

marm-systems + drift

drift remembers code conventions; MARM remembers project decisions

Recall drift conventions AND MARM decisions for project 'dataplatform-migration' and summarize where they interact.

marm-systems + claude-code-organizer

Move short-lived memories out of CC into MARM for long-term recall

Organizer listed my 12 biggest memories — move the 5 project-specific ones into MARM notebooks.

Tools

What this MCP exposes

Tool                  Inputs                             When to call                       Cost
marm_start            project?, tags?                    Start of any working session       free (local)
marm_refresh          session_id?                        Reload context mid-session         free
marm_smart_recall     query: str, top_k?: int            Semantic search past entries       free (local embeddings)
marm_contextual_log   text: str, tags?: str[]            Persist a decision/milestone/fact  free
marm_log_session      session_id                         Review a past session              free
marm_notebook_add     name: str, entry: str              Topic-scoped notes                 free
marm_notebook_use     name: str                          Pin a topic for the session        free
marm_notebook_show    name: str                          Read a notebook                    free
marm_summary          scope: session|notebook|tag, id    Condense long history              free (local)
marm_context_bridge   from_session, to_session           Cross-project reasoning            free

Cost and limits

What it costs to run

API quota
None (local-only)
Tokens per call
Recall responses are typically 500–2000 tokens
Monetary cost
Free, open source
Tip
Rotate or archive old sessions periodically; SQLite handles the scale, but recall quality degrades with noise.
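One way to act on that tip is to move stale rows into an archive table so recall only searches the active set. A sketch with a hypothetical schema (MARM's real tables will differ):

```python
import sqlite3
from datetime import datetime, timedelta, timezone

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE entries (id INTEGER PRIMARY KEY, text TEXT, created_at TEXT);
CREATE TABLE entries_archive (id INTEGER PRIMARY KEY, text TEXT, created_at TEXT);
""")

def archive_older_than(days: int) -> int:
    """Copy entries older than `days` into the archive table,
    delete them from the active table, and return the count."""
    cutoff = (datetime.now(timezone.utc) - timedelta(days=days)).isoformat()
    conn.execute("INSERT INTO entries_archive SELECT * FROM entries WHERE created_at < ?", (cutoff,))
    cur = conn.execute("DELETE FROM entries WHERE created_at < ?", (cutoff,))
    conn.commit()
    return cur.rowcount

# Seed one stale and one fresh entry, then archive at 90 days.
now = datetime.now(timezone.utc)
conn.execute("INSERT INTO entries VALUES (1, 'old decision', ?)", ((now - timedelta(days=120)).isoformat(),))
conn.execute("INSERT INTO entries VALUES (2, 'recent decision', ?)", (now.isoformat(),))
print(archive_older_than(90))  # → 1
```

Archived rows stay queryable if you ever need them, but they no longer add noise to everyday recall.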

Security

Permissions, secrets, scope

Credential storage: Local SQLite; for shared HTTP mode, use env-based OAuth secrets
Data egress: None by default (local storage); if you run it shared, only within your network
Never allow: Exposing port 8001 to the public internet without an auth layer

Troubleshooting

Common errors and fixes

Cannot connect to localhost:8001

Container not running. docker ps to check; docker start marm-mcp-server.

Verify: curl http://localhost:8001/docs
Embeddings model download fails

The first run downloads the embeddings model and needs outbound network access; after that, it works offline.

Verify: check docker logs for the HuggingFace model download
Recall returns unrelated entries

Too few entries for good embeddings. Add more, or pre-filter by tag.

Verify: marm_smart_recall with a tag filter

Alternatives

MARM-Systems vs. others

Alternative          When to use                                                Trade-off
drift                You want code-specific memory (conventions, decisions)     Code-focused; less general
Letta (MemGPT)       You want a research-grade memory agent, not just a server  Heavier to run; opinionated architecture
CC native memories   Claude Code only; no cross-client need                     No semantic search; bloats context

More

Resources

📖 Read the official README on GitHub

🐙 View open issues

🔍 Browse all 400+ MCP servers and Skills