● Community zaidmukaddam ⚡ Instant

scira-mcp-chat

par zaidmukaddam · zaidmukaddam/scira-mcp-chat

A minimalist open-source MCP client web app — bring your own LLM, add MCP servers through a UI, and chat.

scira-mcp-chat is a Next.js web chat interface that acts as an MCP client. Stream responses via Vercel AI SDK (multiple providers), add MCP servers (HTTP or SSE) through a settings panel, and get a clean shadcn/ui interface. Pairs well with hosted MCP providers like Composio, Zapier, or Hugging Face.

Why use it

Key features

Live demo

In practice


Install

Choose your client

~/Library/Application Support/Claude/claude_desktop_config.json  · Windows: %APPDATA%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "scira-mcp-chat": {
      "command": "npx",
      "args": [
        "-y",
        "scira-mcp-chat"
      ]
    }
  }
}

Open Claude Desktop → Settings → Developer → Edit Config. Restart after saving.

~/.cursor/mcp.json · .cursor/mcp.json
{
  "mcpServers": {
    "scira-mcp-chat": {
      "command": "npx",
      "args": [
        "-y",
        "scira-mcp-chat"
      ]
    }
  }
}

Cursor uses the same mcpServers schema as Claude Desktop. The project-level config takes precedence over the global one.

VS Code → Cline → MCP Servers → Edit
{
  "mcpServers": {
    "scira-mcp-chat": {
      "command": "npx",
      "args": [
        "-y",
        "scira-mcp-chat"
      ]
    }
  }
}

Click the MCP Servers icon in the Cline sidebar, then "Edit Configuration".

~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "scira-mcp-chat": {
      "command": "npx",
      "args": [
        "-y",
        "scira-mcp-chat"
      ]
    }
  }
}

Same format as Claude Desktop. Restart Windsurf to apply.

~/.continue/config.json
{
  "mcpServers": [
    {
      "name": "scira-mcp-chat",
      "command": "npx",
      "args": [
        "-y",
        "scira-mcp-chat"
      ]
    }
  ]
}

Continue uses an array of server objects rather than a map.

~/.config/zed/settings.json
{
  "context_servers": {
    "scira-mcp-chat": {
      "command": {
        "path": "npx",
        "args": [
          "-y",
          "scira-mcp-chat"
        ]
      }
    }
  }
}

Add it under context_servers. Zed hot-reloads on save.

claude mcp add scira-mcp-chat -- npx -y scira-mcp-chat

One line. Verify with claude mcp list. Remove with claude mcp remove.

Use cases

Concrete uses: scira-mcp-chat

Self-host an MCP-enabled chat UI for your team

👤 Small teams wanting ChatGPT-like UX with tools ⏱ ~60 min intermediate

When to use it: You want internal chat with MCP tools without paying per-seat for Claude Pro / ChatGPT Team.

Prerequisites
  • Node 20+ and Postgres — Vercel / Docker / self-hosted
  • LLM API keys — at least one of OpenAI / Anthropic / Google
Steps
  1. Clone and deploy
    Walk me through self-hosting scira-mcp-chat on Vercel with Postgres via Supabase.
    → Deployed URL + DB connected
  2. Add MCP servers
    In the settings UI, add Composio's HTTP endpoint for Slack + GitHub.
    → Tools visible in the chat
  3. Share with team
    How do I put this behind basic auth or SSO?
    → Next-Auth or proxy config suggestion

Result: A private chat UI with tools, far cheaper than per-seat SaaS.

Pitfalls
  • Keys stored per-user or globally? Depends on the fork — review the auth/settings flow before entering real credentials
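The "behind basic auth" step above can be sketched as a small check you would run in a Next.js middleware. This is a hypothetical illustration, not code from the scira-mcp-chat repo; the function name and credentials are assumptions.

```typescript
// Minimal HTTP Basic Auth check for a self-hosted deployment.
// Hypothetical sketch — not part of the scira-mcp-chat codebase.

function isAuthorized(header: string | null, user: string, pass: string): boolean {
  if (!header?.startsWith("Basic ")) return false;
  // The Authorization header carries base64("user:pass") after "Basic ".
  const decoded = Buffer.from(header.slice("Basic ".length), "base64").toString("utf8");
  const i = decoded.indexOf(":");
  if (i < 0) return false;
  return decoded.slice(0, i) === user && decoded.slice(i + 1) === pass;
}

// In a middleware.ts you would run this on each request and answer 401
// with a WWW-Authenticate: Basic header when the check fails.
const header = "Basic " + Buffer.from("team:s3cret").toString("base64");
console.log(isAuthorized(header, "team", "s3cret")); // true
```

For SSO, the usual route is NextAuth or an authenticating reverse proxy in front of the app, as the prompt's expected output suggests.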

Experiment with many MCP servers without editing config files

👤 MCP tinkerers / reviewers ⏱ ~30 min beginner

When to use it: You want to evaluate 10 different MCP servers over a weekend without repeatedly editing your Claude Desktop config.

Steps
  1. Add servers one at a time
    In the settings UI, paste the HTTP/SSE URL for each MCP server you want to try.
    → Tools list updates live
  2. Test each
    Use tool X with input Y. Report back.
    → Call + response visible

Result: Rapid MCP evaluation without restart cycles.

Pitfalls
  • stdio-only MCP servers can't be used directly — wrap them with a stdio-to-HTTP bridge, or run them via Smithery, which hosts them

Combinations

Pair it with other MCPs for a 10x effect

scira-mcp-chat + filesystem + fetch + github

DIY Claude Desktop replacement

Add filesystem (via HTTP wrapper), fetch, and github MCPs in the settings. Use as a daily driver.

Tools

What this MCP exposes

Tool | Inputs | When to call | Cost
(web UI) | Natural chat | This is the chat UI itself — not an MCP server called from another client | LLM provider cost

Cost and limits

Running cost

API quota
Your LLM provider's rate limits
Tokens per call
Pass-through to the model
Monetary
Self-hosting = just infra cost. LLM = pay-as-you-go. MCP providers (Composio etc.) have their own tiers.
Tip
Use DeepSeek or Gemini Flash as the default model for team chat — 10-100x cheaper than GPT-4.

Security

Permissions, secrets, scope

Credential storage: API keys in env vars (self-hosted) or in user settings (multi-user mode)
Data egress: Your LLM provider plus whichever MCP servers you add
Never grant: An unauthenticated public deployment — your LLM keys will get drained

Troubleshooting

Common errors and fixes

Build fails on deploy

Next.js version mismatch — use the Node version in .nvmrc.

Check: node --version
Added MCP server, no tools appear

Transport mismatch. HTTP URL must return the MCP handshake; SSE URL must keep the stream open.

Check: curl the URL and inspect the response
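A streamable-HTTP MCP endpoint should answer a JSON-RPC 2.0 `initialize` request before any tools appear. The sketch below builds that first handshake message; the `protocolVersion` date and `clientInfo` values are assumptions — use whatever your server advertises.

```typescript
// Build the JSON-RPC 2.0 `initialize` request an MCP client sends first.
// Sketch only — protocolVersion and clientInfo are assumed values.

function buildInitialize(id: number = 1) {
  return {
    jsonrpc: "2.0" as const,
    id,
    method: "initialize",
    params: {
      protocolVersion: "2025-03-26", // assumption: match your server's spec revision
      capabilities: {},
      clientInfo: { name: "handshake-probe", version: "0.0.1" },
    },
  };
}

// POST this body (Content-Type: application/json) to the server's HTTP
// URL; a healthy endpoint replies with its serverInfo and capabilities.
console.log(JSON.stringify(buildInitialize()));
```

If the endpoint is SSE instead, the check is different: the URL must hold the event stream open rather than return a single JSON response.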
DB connection errors

DATABASE_URL format: postgres://user:pass@host:5432/db?sslmode=require

Check: psql $DATABASE_URL
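Before reaching for psql, the connection string's shape can be sanity-checked with the standard WHATWG URL parser; a small sketch using the example value from above:

```typescript
// Sanity-check a Postgres connection string's parts before deploying.
// Example value only — substitute your real DATABASE_URL.
const url = new URL("postgres://user:pass@host:5432/db?sslmode=require");

console.log(url.hostname);                    // "host"
console.log(url.port);                        // "5432"
console.log(url.pathname.slice(1));           // database name: "db"
console.log(url.searchParams.get("sslmode")); // "require"
```

A throw from `new URL(...)` or a missing `sslmode=require` is a quick tell that the env var is malformed before any deploy attempt.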

Alternatives

scira-mcp-chat vs others

Alternative | When to use it | Trade-off
LibreChat | You want a more mature, feature-rich chat UI | Heavier; MCP support via plugins
Open WebUI | You want native MCP 0.6.31+ support and Ollama integration | More complex but batteries-included
Claude Desktop / Claude Code | You only use Anthropic and want zero infra | Paid; single-user

More

Resources

📖 Read the official README on GitHub

🐙 View open issues

🔍 Browse the 400+ MCP servers and Skills