
mcp-agent-langchainjs

by Azure-Samples · Azure-Samples/mcp-agent-langchainjs

Azure's official reference — a serverless LangChain.js agent that uses MCP to call a burger-ordering tool API, fully deployable via azd up.

This is an Azure Samples reference app, not an end-user MCP. It shows how to build a serverless LangChain.js agent that integrates MCP for tool calls, deployed to Azure Static Web Apps + Functions + Cosmos DB. The demo is a burger restaurant — but the pattern applies to any tool-using agent you want on Azure.

Why use it

Key features

Live Demo

What it looks like in practice


Install

Pick your client

~/Library/Application Support/Claude/claude_desktop_config.json  · Windows: %APPDATA%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "agent-langchainjs": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-agent-langchainjs"
      ]
    }
  }
}

Open Claude Desktop → Settings → Developer → Edit Config. Restart after saving.

~/.cursor/mcp.json · .cursor/mcp.json
{
  "mcpServers": {
    "agent-langchainjs": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-agent-langchainjs"
      ]
    }
  }
}

Cursor uses the same mcpServers schema as Claude Desktop. Project config wins over global.
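That precedence can be sketched as a simple object spread — illustrative only, since Cursor's actual merge behavior is internal to the editor:

```typescript
// Illustrative sketch of "project config wins over global" for the
// mcpServers map. Cursor's real merge logic is internal; this only
// shows why a same-named project entry shadows the global one.
type ServerDef = { command: string; args: string[] };
type McpConfig = { mcpServers: Record<string, ServerDef> };

function mergeConfigs(globalCfg: McpConfig, projectCfg: McpConfig): McpConfig {
  // The later spread wins, so a project entry overrides a same-named global one.
  return { mcpServers: { ...globalCfg.mcpServers, ...projectCfg.mcpServers } };
}

const globalCfg: McpConfig = {
  mcpServers: {
    "agent-langchainjs": { command: "npx", args: ["-y", "mcp-agent-langchainjs"] },
  },
};
const projectCfg: McpConfig = {
  mcpServers: {
    "agent-langchainjs": { command: "node", args: ["./local-build/index.js"] },
  },
};

// The project-level command replaces the global npx invocation.
console.log(mergeConfigs(globalCfg, projectCfg).mcpServers["agent-langchainjs"].command);
// → node
```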

VS Code → Cline → MCP Servers → Edit
{
  "mcpServers": {
    "agent-langchainjs": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-agent-langchainjs"
      ]
    }
  }
}

Click the MCP Servers icon in the Cline sidebar, then "Edit Configuration".

~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "agent-langchainjs": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-agent-langchainjs"
      ]
    }
  }
}

Same shape as Claude Desktop. Restart Windsurf to pick up changes.

~/.continue/config.json
{
  "mcpServers": [
    {
      "name": "agent-langchainjs",
      "command": "npx",
      "args": [
        "-y",
        "mcp-agent-langchainjs"
      ]
    }
  ]
}

Continue uses an array of server objects rather than a map.
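The reshaping from the map style used above to Continue's array style is mechanical. A hypothetical converter (not part of Continue — shown only to highlight the structural difference) might look like:

```typescript
// Hypothetical converter from the map-style mcpServers config (Claude
// Desktop, Cursor, Windsurf) to Continue's array-of-objects shape.
type ServerDef = { command: string; args: string[] };

function toContinueServers(map: Record<string, ServerDef>) {
  // Each map key becomes an explicit "name" field on the array entry.
  return Object.entries(map).map(([name, def]) => ({ name, ...def }));
}

const mapStyle: Record<string, ServerDef> = {
  "agent-langchainjs": { command: "npx", args: ["-y", "mcp-agent-langchainjs"] },
};

console.log(JSON.stringify({ mcpServers: toContinueServers(mapStyle) }, null, 2));
```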

~/.config/zed/settings.json
{
  "context_servers": {
    "agent-langchainjs": {
      "command": {
        "path": "npx",
        "args": [
          "-y",
          "mcp-agent-langchainjs"
        ]
      }
    }
  }
}

Add to context_servers. Zed hot-reloads on save.

claude mcp add agent-langchainjs -- npx -y mcp-agent-langchainjs

One-liner. Verify with claude mcp list. Remove with claude mcp remove agent-langchainjs.

Use Cases

Real-world ways to use mcp-agent-langchainjs

Bootstrap a serverless agent on Azure with MCP tool calls

👤 Azure devs building AI features ⏱ ~120 min · advanced

When to use: You want to ship an LLM-powered feature on Azure and need a working reference to fork.

Prerequisites
  • Azure subscription — azure.microsoft.com — free tier covers dev
  • Azure Developer CLI — brew install azd or the Windows installer
Flow
  1. Fork and deploy
    Fork Azure-Samples/mcp-agent-langchainjs and walk me through azd up to deploy to my Azure sub.
    → Live Azure URL + Functions + Cosmos provisioned
  2. Swap the demo tool
    Replace the burger-ordering MCP with a custom MCP for my domain (e.g. appointment booking). Show me the wiring.
    → Code diff + working custom tool
  3. Customize the UI
    The sample has a chat UI; customize brand/colors and the welcome message.
    → Styled app

Outcome: A shippable Azure-hosted agent derived from a vetted sample.

Pitfalls
  • Free-tier Azure OpenAI has low quota — Provision your own OpenAI resource in a region with capacity; set endpoint in env
  • Local Ollama doesn't handle complex tool calls well — Use a cloud model (GPT-4o-mini, etc.) for dev that involves multi-step tool calls

Learn the MCP + LangChain.js integration pattern

👤 Devs new to MCP ⏱ ~60 min · intermediate

When to use: You're evaluating MCP and want to see how it plugs into the LangChain.js ecosystem.

Flow
  1. Read the code
    Summarize how this repo wires MCP to LangChain.js agents. What's the key integration point?
    → Architecture explanation
  2. Run locally
    Run it in Codespaces. Exercise the burger-order flow. Observe the MCP tool calls in logs.
    → Working local run + tool call traces

Outcome: Hands-on understanding of the pattern before building your own.
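The integration point from step 1 can be sketched with plain objects. These are simplified stand-ins, not the real MCP SDK or LangChain.js interfaces, and order_burger is a hypothetical tool name — the sketch only shows the shape of the wiring:

```typescript
// Simplified stand-ins for the MCP client and LangChain-style tools.
// The repo uses the actual MCP and LangChain.js libraries; the pattern is:
// every tool the MCP server advertises becomes an agent tool whose
// handler forwards the call back through the MCP client.
type McpTool = { name: string; description: string };
type McpClient = {
  listTools(): McpTool[];
  callTool(name: string, args: Record<string, unknown>): string;
};

type AgentTool = {
  name: string;
  description: string;
  invoke(args: Record<string, unknown>): string;
};

function bindMcpTools(client: McpClient): AgentTool[] {
  return client.listTools().map((t) => ({
    name: t.name,
    description: t.description,
    invoke: (args) => client.callTool(t.name, args), // forward into MCP
  }));
}

// Fake burger-ordering client standing in for the sample's tool API.
const fakeClient: McpClient = {
  listTools: () => [{ name: "order_burger", description: "Place a burger order" }],
  callTool: (name, args) => `${name} called with ${JSON.stringify(args)}`,
};

const tools = bindMcpTools(fakeClient);
console.log(tools[0].invoke({ size: "large" }));
// → order_burger called with {"size":"large"}
```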

Combinations

Pair with other MCPs for 10x leverage

agent-langchainjs + github

CI/CD the sample to your own fork

Fork the repo, then set up GitHub Actions to run azd deploy on push to main.

Tools

What this MCP exposes

Tool | Inputs | When to call | Cost
(reference app — not a callable MCP) | N/A | This is a sample app you deploy, not a tool Claude calls | N/A

Cost & Limits

What this costs to run

API quota
Azure consumption-based
Tokens per call
N/A — you're building the app, not calling it as a tool
Monetary
Varies — cheap on free tier for dev; production costs depend on traffic
Tip
Use Azure cost alerts early. Cosmos DB can be expensive if mis-provisioned — keep it on serverless tier during dev.

Security

Permissions, secrets, blast radius

Credential storage: Azure Key Vault + Managed Identity (set up by the Bicep templates)
Data egress: Entirely within your Azure sub + chosen LLM endpoint

Troubleshooting

Common errors and fixes

azd up fails: no capacity in region

OpenAI capacity varies by region. Try eastus2, swedencentral, or francecentral.

Functions cold-start slowness

Use Premium plan for prod; Consumption is fine for dev but cold-starts stall early chats.

MCP tool call not recognized

Confirm the LangChain.js tool binding is using the MCP client the sample sets up. Check the imports.

Alternatives

mcp-agent-langchainjs vs others

Alternative | When to use it instead | Tradeoff
Vercel AI SDK starter | You prefer Vercel / Next.js hosting | Different cloud; smaller sample
AWS Bedrock Agents + sample | You're on AWS | Different stack; Bedrock agents aren't MCP-native

More

Resources

📖 Read the official README on GitHub

🐙 Browse open issues
