● Community · Lyellr88 · ⚡ Instant

MARM-Systems

by Lyellr88 · Lyellr88/MARM-Systems

Persistent, searchable memory across Claude, Qwen, Gemini, and any MCP client — sessions, notebooks, and semantic recall in one server.

MARM-Systems is a Python MCP server that gives any AI client a persistent memory layer: sessions, notebooks, auto-classifying contextual logs, and semantic vector search. It supports STDIO, HTTP, and WebSocket transports and uses a SQLite backend with WAL mode for concurrent access.
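As a minimal illustration of what WAL mode buys here (this sketch assumes a simplified single-table schema, which is an assumption, not MARM's actual schema): once `journal_mode=WAL` is set, a second connection can read the database while a writer connection remains open.

```python
import os
import sqlite3
import tempfile

# Hypothetical one-table schema for illustration only.
path = os.path.join(tempfile.mkdtemp(), "memory.db")

writer = sqlite3.connect(path)
writer.execute("PRAGMA journal_mode=WAL")  # switch to write-ahead logging
writer.execute("CREATE TABLE entries (id INTEGER PRIMARY KEY, text TEXT)")
writer.execute("INSERT INTO entries (text) VALUES ('decided on Fivetran for CDC')")
writer.commit()

# In WAL mode, readers are not blocked by the open writer connection.
reader = sqlite3.connect(path)
rows = reader.execute("SELECT text FROM entries").fetchall()
print(rows)  # [('decided on Fivetran for CDC',)]
```

This is why multiple MCP clients can query the same memory store concurrently without stepping on each other.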


Install

Choose your client

~/Library/Application Support/Claude/claude_desktop_config.json  · Windows: %APPDATA%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "marm-systems": {
      "command": "uvx",
      "args": [
        "MARM-Systems"
      ]
    }
  }
}

Open Claude Desktop → Settings → Developer → Edit Config. Restart after saving.

~/.cursor/mcp.json · .cursor/mcp.json
{
  "mcpServers": {
    "marm-systems": {
      "command": "uvx",
      "args": [
        "MARM-Systems"
      ]
    }
  }
}

Cursor uses the same mcpServers schema as Claude Desktop. The project-level config overrides the global one.

VS Code → Cline → MCP Servers → Edit
{
  "mcpServers": {
    "marm-systems": {
      "command": "uvx",
      "args": [
        "MARM-Systems"
      ]
    }
  }
}

Click the MCP Servers icon in the Cline sidebar, then "Edit Configuration".

~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "marm-systems": {
      "command": "uvx",
      "args": [
        "MARM-Systems"
      ]
    }
  }
}

Same structure as Claude Desktop. Restart Windsurf to apply.

~/.continue/config.json
{
  "mcpServers": [
    {
      "name": "marm-systems",
      "command": "uvx",
      "args": [
        "MARM-Systems"
      ]
    }
  ]
}

Continue uses an array of server objects instead of a map.

~/.config/zed/settings.json
{
  "context_servers": {
    "marm-systems": {
      "command": {
        "path": "uvx",
        "args": [
          "MARM-Systems"
        ]
      }
    }
  }
}

Add it under context_servers. Zed reloads on save.

claude mcp add marm-systems -- uvx MARM-Systems

A one-liner. Verify with claude mcp list; remove with claude mcp remove marm-systems.

Use cases

MARM-Systems in practice

Build persistent memory across sessions on a long project

👤 Anyone working on a multi-week effort with an AI · ⏱ ~15 min · beginner

When to use: You keep re-explaining the same project background every session.

Prerequisites
  • MARM-Systems installed and running — docker pull lyellr88/marm-mcp-server && docker run -d -p 8001:8001 lyellr88/marm-mcp-server
Steps
  1. Start a session tagged to the project
    marm_start with project tag 'dataplatform-migration'. Log that we're migrating from Redshift to Snowflake, deadline end of Q2.
    → Session started; initial entry saved
  2. Drop context as you work
    marm_contextual_log: 'Decided to use Fivetran for CDC replication, evaluated Airbyte but config overhead too high.'
    → Auto-classified and stored
  3. Next session, recall
    marm_smart_recall 'what did we decide about CDC tooling?'
    → Relevant past decision surfaces
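The recall in step 3 is semantic rather than keyword-based: entries are embedded as vectors and ranked by similarity to the query. A toy sketch of that ranking idea (MARM uses a real sentence-embedding model; the bag-of-words stand-in below is purely illustrative):

```python
import math
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding"; a real model captures meaning, not just tokens.
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

entries = [
    "Decided to use Fivetran for CDC replication",
    "Deadline for the Snowflake migration is end of Q2",
    "Team lunch moved to Thursday",
]
query = "what did we decide about CDC tooling"
ranked = sorted(entries, key=lambda e: cosine(embed(query), embed(e)), reverse=True)
print(ranked[0])  # → Decided to use Fivetran for CDC replication
```

With real embeddings, "CDC tooling" also matches paraphrases like "change data capture", which is what makes recall work across sessions worded differently.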

Result: Session N+1 starts with the context of sessions 1 through N accessible by query instead of being re-typed.

Pitfalls
  • Dumping every chat into memory clogs recall — Use marm_contextual_log for decisions and milestones, not every exchange
  • Vector search misses on jargon — Tag entries explicitly with project names for keyword fallback
Combine with: drift

Share memory across a team using multiple AI assistants

👤 Teams where different members use Claude / Qwen / Gemini · ⏱ ~45 min · advanced

When to use: Knowledge trapped in one person's AI shouldn't be.

Steps
  1. Run MARM as a shared HTTP server
    Deploy MARM-Systems on a team server; each teammate points their MCP client at http://marm.team.internal:8001/mcp.
    → All clients connect
  2. Log shared context
    marm_contextual_log: 'Database backup runbook is at /runbooks/db-backup.md; last updated 2026-04-10.'
    → Everyone can recall it
  3. Recall from any client
    From a teammate's Qwen session: marm_smart_recall 'how do I restore the DB?'
    → Same answer surfaces
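For step 1, each teammate's client needs to point at the shared endpoint instead of spawning a local process. Clients with native HTTP transport support take the URL directly; stdio-only clients commonly go through a bridge such as mcp-remote. The hostname, the bridge choice, and the exact key names below are illustrative and vary by client:

```json
{
  "mcpServers": {
    "marm-systems": {
      "command": "npx",
      "args": ["mcp-remote", "http://marm.team.internal:8001/mcp"]
    }
  }
}
```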

Result: Cross-AI shared institutional memory.

Pitfalls
  • Shared memory needs real OAuth — don't rely on the dev-mode hardcoded credentials; wire it to your IdP before production use
  • Sensitive data leaks into shared recall — Use project tags and scope queries by tag

Use notebooks as AI-queryable scratchpads

👤 Learners, researchers · ⏱ ~5 min · beginner

When to use: You want a topic-scoped note area the AI can search later.

Steps
  1. Create a notebook
    marm_notebook_add name='rust-ownership' with entry: 'Move vs borrow: move transfers ownership, borrow lets you peek.'
    → Notebook created
  2. Add as you learn
    marm_notebook_add 'rust-ownership': 'Mutable borrow is exclusive; only one at a time.'
    → Entry appended
  3. Recall later
    marm_notebook_show 'rust-ownership'
    → Full notebook

Result: A queryable learning journal.

Combinations

With other MCPs for 10x impact

marm-systems + drift

drift remembers code conventions; MARM remembers project decisions

Recall drift conventions AND MARM decisions for project 'dataplatform-migration' and summarize where they interact.
marm-systems + claude-code-organizer

Move short-lived memories out of CC into MARM for long-term recall

Organizer listed my 12 biggest memories — move the 5 project-specific ones into MARM notebooks.

Tools

What this MCP provides

Tool                  Inputs                            When to call                        Cost
marm_start            project?, tags?                   Start of any working session        free (local)
marm_refresh          session_id?                       Reload context mid-session          free
marm_smart_recall     query: str, top_k?: int           Semantic search over past entries   free (local embeddings)
marm_contextual_log   text: str, tags?: str[]           Persist a decision/milestone/fact   free
marm_log_session      session_id                        Review a past session               free
marm_notebook_add     name: str, entry: str             Topic-scoped notes                  free
marm_notebook_use     name: str                         Pin a topic for the session         free
marm_notebook_show    name: str                         Read a notebook                     free
marm_summary          scope: session|notebook|tag, id   Condense long history               free (local)
marm_context_bridge   from_session, to_session          Cross-project reasoning             free
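Under the hood, each of these tools is invoked as a standard MCP tools/call request, so any MCP-capable client can drive them. A recall call might look like this on the wire (the argument values here are illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 7,
  "method": "tools/call",
  "params": {
    "name": "marm_smart_recall",
    "arguments": {
      "query": "what did we decide about CDC tooling?",
      "top_k": 5
    }
  }
}
```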

Costs & limits

What it costs to run

API quota
None — local-only
Tokens per call
Recall responses typically 500-2000 tokens
Cost
Free, open source
Tip
Rotate or archive old sessions periodically — SQLite handles the scale, but recall quality degrades with noise.

Security

Permissions, secrets, scope

Credential storage: Local SQLite; for shared HTTP mode, use env-based OAuth secrets
Data egress: None by default (local storage); if you run it shared, only within your network
Never grant: Port 8001 exposed to the public internet without an auth layer

Troubleshooting

Common errors and fixes

Cannot connect to localhost:8001

Container not running. docker ps to check; docker start marm-mcp-server.

Check: curl http://localhost:8001/docs
Embeddings model download fails

The first run downloads the model and needs outbound network access; after that, it works offline.

Check: docker logs for the HuggingFace download
Recall returns unrelated entries

Too few entries for good embeddings; add more, or pre-filter by tag.

Check: marm_smart_recall with a tag filter

Alternatives

MARM-Systems vs. others

Alternative          When instead                                                 Trade-off
drift                You want code-specific memory (conventions, decisions)      Code-focused; less general
Letta (MemGPT)       You want a research-grade memory agent, not just a server   Heavier to run; opinionated architecture
CC native memories   Claude Code only; no cross-client need                       No semantic search; bloats context

More

Resources

📖 Read the official README on GitHub

🐙 View open issues

🔍 Browse all 400+ MCP servers and skills