Bootstrap a serverless agent on Azure with MCP tool calls
When to use it: You want to ship an LLM-powered feature on Azure and need a working reference to fork.
Prerequisites
- Azure subscription — azure.microsoft.com — free tier covers dev
- Azure Developer CLI — brew install azd, or the Windows installer
Steps
- Fork and deploy. Prompt: "Fork Azure-Samples/mcp-agent-langchainjs and walk me through azd up to deploy to my Azure sub." → Live Azure URL + Functions + Cosmos provisioned
- Swap the demo tool. Prompt: "Replace the burger-ordering MCP with a custom MCP for my domain (e.g. appointment booking). Show me the wiring." → Code diff + working custom tool
- Customize the UI. Prompt: "The sample has a chat UI; customize brand/colors and the welcome message." → Styled app
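The tool-swap step boils down to two pieces: a JSON Schema the MCP server advertises, and a handler it dispatches to on a tool call. A minimal sketch of that shape for the appointment-booking example — the names (book_appointment, the in-memory slot store standing in for Cosmos DB) are illustrative, and the actual sample wires tools through the MCP SDK rather than plain objects:

```typescript
// Shape of an MCP tool result: optional error flag plus text content parts.
type ToolResult = { isError?: boolean; content: { type: string; text: string }[] };

// Tool descriptor the server advertises via tools/list (hypothetical names).
const bookAppointmentTool = {
  name: "book_appointment",
  description: "Book an appointment slot for a customer",
  inputSchema: {
    type: "object",
    properties: {
      customer: { type: "string" },
      slot: { type: "string", description: "ISO 8601 start time" },
    },
    required: ["customer", "slot"],
  },
};

// In-memory store standing in for the Cosmos DB the sample provisions.
const takenSlots = new Set<string>();

// Handler the server dispatches to on tools/call for book_appointment.
function handleBookAppointment(args: { customer: string; slot: string }): ToolResult {
  if (takenSlots.has(args.slot)) {
    return { isError: true, content: [{ type: "text", text: `Slot ${args.slot} is taken` }] };
  }
  takenSlots.add(args.slot);
  return { content: [{ type: "text", text: `Booked ${args.slot} for ${args.customer}` }] };
}
```

The diff the prompt asks for amounts to deleting the burger tool's descriptor/handler pair and registering your own like the above.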
Result: A shippable Azure-hosted agent derived from a vetted sample.
Pitfalls
- Free-tier Azure OpenAI has low quota — Provision your own OpenAI resource in a region with capacity; set endpoint in env
- Local Ollama doesn't handle complex tool calls well — Use a cloud model (e.g. GPT-4o-mini) for any dev work involving multi-step tool calls
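Both pitfalls reduce to one config decision: use the cloud endpoint when it's set, and treat local Ollama as a fallback. A sketch, assuming the env var names and fallback model below (they are illustrative, not the sample's actual settings):

```typescript
// Model config the agent would pass to its chat client.
type ModelConfig = { provider: string; endpoint: string; model: string };

// Pick the dev model: prefer a provisioned Azure OpenAI resource when its
// endpoint env var is set (AZURE_OPENAI_ENDPOINT is an assumed name),
// otherwise fall back to local Ollama — fine for simple chat, but
// unreliable for multi-step tool calls, per the pitfall above.
function pickModelConfig(env: Record<string, string | undefined>): ModelConfig {
  if (env.AZURE_OPENAI_ENDPOINT) {
    return {
      provider: "azure-openai",
      endpoint: env.AZURE_OPENAI_ENDPOINT,
      model: "gpt-4o-mini",
    };
  }
  return {
    provider: "ollama",
    endpoint: "http://localhost:11434",
    model: env.OLLAMA_MODEL ?? "llama3",
  };
}
```

Setting the endpoint in your deployment env (azd stores these as environment values) then flips the whole agent to the provisioned resource without a code change.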