● Official Azure-Samples ⚡ Instant

mcp-agent-langchainjs

by Azure-Samples · Azure-Samples/mcp-agent-langchainjs

Azure's official reference — a serverless LangChain.js agent that uses MCP to call a burger-ordering tool API, fully deployable via azd up.

This is an Azure Samples reference app, not an end-user MCP. It shows how to build a serverless LangChain.js agent that integrates MCP for tool calls, deployed to Azure Static Web Apps + Functions + Cosmos DB. The demo is a burger restaurant — but the pattern applies to any tool-using agent you want on Azure.
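The integration pattern at the heart of the sample can be sketched in a few lines of TypeScript. This is an illustrative shape only — `McpToolDescriptor`, `bridgeMcpTool`, and `fakeCallMcp` are hypothetical names, not the sample's actual code, which uses real MCP client and LangChain.js libraries:

```typescript
// Minimal sketch of the MCP -> LangChain.js bridging pattern.
// All names here are illustrative, not the sample's actual API.

// An MCP server advertises tools as a name + description + JSON schema.
interface McpToolDescriptor {
  name: string;
  description: string;
  inputSchema: Record<string, unknown>;
}

// A LangChain.js-style tool: name, description, and an async function
// the agent invokes when the LLM emits a tool call.
interface AgentTool {
  name: string;
  description: string;
  func: (input: Record<string, unknown>) => Promise<string>;
}

// The key integration point: wrap each MCP tool so that the agent's
// tool invocation is forwarded to the MCP server's call endpoint.
function bridgeMcpTool(
  tool: McpToolDescriptor,
  callMcp: (name: string, args: Record<string, unknown>) => Promise<string>,
): AgentTool {
  return {
    name: tool.name,
    description: tool.description,
    func: (input) => callMcp(tool.name, input),
  };
}

// A fake transport standing in for the burger-ordering tool API.
async function fakeCallMcp(
  name: string,
  args: Record<string, unknown>,
): Promise<string> {
  return JSON.stringify({ tool: name, args, status: "ok" });
}

const orderTool = bridgeMcpTool(
  { name: "place_order", description: "Place a burger order", inputSchema: {} },
  fakeCallMcp,
);
```

The agent never talks to the tool API directly; every tool call flows through the MCP client, which is what makes the burger demo swappable for any other tool server.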

Why use it

Key features

Live demo

Hands-on preview


Install

Choose your client

~/Library/Application Support/Claude/claude_desktop_config.json  · Windows: %APPDATA%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "agent-langchainjs": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-agent-langchainjs"
      ]
    }
  }
}

Open Claude Desktop → Settings → Developer → Edit Config. Restart after saving.
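If you prefer to script the edit instead of editing the file by hand, merging the entry into an existing config object looks like this — a sketch only; `addMcpServer` is a hypothetical helper, not part of any client:

```typescript
// Hypothetical helper: add a server entry to a Claude Desktop-style
// config object without clobbering servers that are already registered.
type McpServerEntry = { command: string; args: string[] };
type ClaudeConfig = { mcpServers?: Record<string, McpServerEntry> };

function addMcpServer(
  config: ClaudeConfig,
  name: string,
  entry: McpServerEntry,
): ClaudeConfig {
  return {
    ...config,
    // Spread the existing map first so the new entry is appended,
    // replacing only a server that already uses the same name.
    mcpServers: { ...(config.mcpServers ?? {}), [name]: entry },
  };
}

const updated = addMcpServer(
  { mcpServers: { existing: { command: "node", args: ["server.js"] } } },
  "agent-langchainjs",
  { command: "npx", args: ["-y", "mcp-agent-langchainjs"] },
);
```

Read the JSON file, pass the parsed object through the helper, and write it back; existing entries survive untouched.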

~/.cursor/mcp.json · .cursor/mcp.json
{
  "mcpServers": {
    "agent-langchainjs": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-agent-langchainjs"
      ]
    }
  }
}

Cursor uses the same mcpServers schema as Claude Desktop. Project config overrides global config.
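The precedence rule is easy to model: project entries shadow global ones by server name, and names unique to either side survive. A sketch (the function name is illustrative, not Cursor's API):

```typescript
// Project config (.cursor/mcp.json) shadows global (~/.cursor/mcp.json)
// on a per-server-name basis.
type ServerMap = Record<string, { command: string; args: string[] }>;

function resolveServers(globalCfg: ServerMap, projectCfg: ServerMap): ServerMap {
  // Spreading the project map second means its entries win on name collisions.
  return { ...globalCfg, ...projectCfg };
}

const merged = resolveServers(
  { "agent-langchainjs": { command: "npx", args: ["-y", "mcp-agent-langchainjs"] } },
  { "agent-langchainjs": { command: "node", args: ["./local-dev.js"] } },
);
```

Here the project-level override wins, so `merged["agent-langchainjs"]` runs the local script.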

VS Code → Cline → MCP Servers → Edit
{
  "mcpServers": {
    "agent-langchainjs": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-agent-langchainjs"
      ]
    }
  }
}

Click the MCP Servers icon in the Cline sidebar, then "Edit Configuration".

~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "agent-langchainjs": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-agent-langchainjs"
      ]
    }
  }
}

Same format as Claude Desktop. Restart Windsurf to apply.

~/.continue/config.json
{
  "mcpServers": [
    {
      "name": "agent-langchainjs",
      "command": "npx",
      "args": [
        "-y",
        "mcp-agent-langchainjs"
      ]
    }
  ]
}

Continue uses an array of server objects rather than a map.
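The difference is purely mechanical: each map key becomes a `name` field on its own object. Converting the common mcpServers map into Continue's array shape is a one-liner (the helper name is hypothetical):

```typescript
// Continue expects [{ name, command, args }, ...]
// instead of { name: { command, args } }.
type Entry = { command: string; args: string[] };

function toContinueFormat(
  servers: Record<string, Entry>,
): Array<{ name: string } & Entry> {
  return Object.entries(servers).map(([name, entry]) => ({ name, ...entry }));
}

const continueServers = toContinueFormat({
  "agent-langchainjs": { command: "npx", args: ["-y", "mcp-agent-langchainjs"] },
});
```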

~/.config/zed/settings.json
{
  "context_servers": {
    "agent-langchainjs": {
      "command": {
        "path": "npx",
        "args": [
          "-y",
          "mcp-agent-langchainjs"
        ]
      }
    }
  }
}

Add it under context_servers. Zed hot-reloads on save.
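Zed's shape differs again: the executable goes under a nested `command.path` rather than a flat command string. A sketch of the transformation (the helper name is illustrative):

```typescript
// Zed nests the binary under command.path instead of a flat "command" string.
function toZedEntry(command: string, args: string[]) {
  return { command: { path: command, args } };
}

const zedServer = toZedEntry("npx", ["-y", "mcp-agent-langchainjs"]);
```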

claude mcp add agent-langchainjs -- npx -y mcp-agent-langchainjs

One line. Verify with claude mcp list. Remove with claude mcp remove.

Use cases

Concrete uses: mcp-agent-langchainjs

Bootstrap a serverless agent on Azure with MCP tool calls

👤 Azure devs building AI features · ⏱ ~120 min · advanced

When to use it: You want to ship an LLM-powered feature on Azure and need a working reference to fork.

Prerequisites
  • Azure subscription — azure.microsoft.com — free tier covers dev
  • Azure Developer CLI — brew install azd or the Windows installer
Steps
  1. Fork and deploy
    Fork Azure-Samples/mcp-agent-langchainjs and walk me through azd up to deploy to my Azure sub.
    → Live Azure URL + Functions + Cosmos provisioned
  2. Swap the demo tool
    Replace the burger-ordering MCP with a custom MCP for my domain (e.g. appointment booking). Show me the wiring.
    → Code diff + working custom tool
  3. Customize the UI
    The sample has a chat UI; customize brand/colors and the welcome message.
    → Styled app

Result: A shippable Azure-hosted agent derived from a vetted sample.

Pitfalls
  • Free-tier Azure OpenAI has low quota — Provision your own OpenAI resource in a region with capacity; set endpoint in env
  • Local Ollama doesn't handle complex tool calls well — Use a cloud model (GPT-4o-mini, etc.) for dev that involves multi-step tool calls

Learn the MCP + LangChain.js integration pattern

👤 Devs new to MCP · ⏱ ~60 min · intermediate

When to use it: You're evaluating MCP and want to see how it plugs into the LangChain.js ecosystem.

Steps
  1. Read the code
    Summarize how this repo wires MCP to LangChain.js agents. What's the key integration point?
    → Architecture explanation
  2. Run locally
    Run it in Codespaces. Exercise the burger-order flow. Observe the MCP tool calls in logs.
    → Working local run + tool call traces

Result: Hands-on understanding of the pattern before building your own.

Combinations

Pair it with other MCPs for a 10x effect

agent-langchainjs + github

CI/CD the sample to your own fork

Fork the repo, set up GitHub Actions to azd deploy on push to main.

Tools

What this MCP exposes

Tool | Inputs | When to call | Cost
(reference app — not a callable MCP) | N/A | This is a sample app you deploy, not a tool Claude calls | N/A

Cost and limits

Running cost

API quota
Azure consumption-based
Tokens per call
N/A — you're building the app, not calling it as a tool
Monetary
Varies — cheap on the free tier for dev; production costs depend on traffic
Tip
Use Azure cost alerts early. Cosmos DB can get expensive if mis-provisioned — keep it on the serverless tier during dev.

Security

Permissions, secrets, scope

Credential storage: Azure Key Vault + Managed Identity (set up by the Bicep templates)
Data egress: stays entirely within your Azure subscription + your chosen LLM endpoint

Troubleshooting

Common errors and fixes

azd up fails: no capacity in region

OpenAI capacity varies by region. Try eastus2, swedencentral, or francecentral.

Functions cold-start slowness

Use Premium plan for prod; Consumption is fine for dev but cold-starts stall early chats.

MCP tool call not recognized

Confirm the LangChain.js tool binding is using the MCP client the sample sets up. Check the imports.

Alternatives

mcp-agent-langchainjs vs. the alternatives

Alternative | When to use it | Trade-off
Vercel AI SDK starter | You prefer Vercel / Next.js hosting | Different cloud; smaller sample
AWS Bedrock Agents + sample | You're on AWS | Different stack; Bedrock agents aren't MCP-native

More

Resources

📖 Read the official README on GitHub

🐙 View the open issues

🔍 Browse the 400+ MCP servers and Skills