Table of Contents

  1. Introduction
  2. Prerequisites
  3. Project Setup
  4. Configuration
  5. Implementing the Azure OpenAI Service
  6. Implementing the MCP Client Service
  7. Creating the Chat Orchestrator
  8. Building the API Endpoints
  9. Testing and Validation
  10. Troubleshooting
  11. Deployment to Azure
  12. Next Steps

Introduction

This guide provides complete, step-by-step instructions for building an ASP.NET Core application that integrates Azure OpenAI (LLM) services with a Model Context Protocol (MCP) server. The application acts as an intelligent middleware that:

  • Receives user requests via REST API
  • Processes them through Azure’s Large Language Model
  • Leverages MCP tools for enhanced functionality
  • Returns intelligent, context-aware responses

What You’ll Build

By the end of this guide, you’ll have a minimal but fully functional ASP.NET Core application that:

  • Accepts chat requests from clients
  • Uses Azure OpenAI to understand and process natural language
  • Connects to a remote MCP server (hosted at https://your-mcp-server.azurewebsites.net)
  • Executes MCP tools when needed (e.g., getting time in different timezones)
  • Returns formatted responses to clients

To learn how to build your own MCP server with ASP.NET Core, check out this comprehensive guide.

Architecture Overview

Client (REST API)
        |
        v
ASP.NET Core App (MCP Client + Azure LLM)
        |                            |
        | service call               | tool execution
        v                            v
  Azure OpenAI               Remote MCP Server

Prerequisites

Before you begin, ensure you have the following:

Development Environment

  1. .NET 9.0 SDK or later

  2. Code Editor

    • Visual Studio 2022 (recommended)
    • Visual Studio Code with C# extension
    • JetBrains Rider
  3. HTTP Client for Testing

    • Visual Studio’s built-in .http file support
    • VS Code’s REST Client extension
    • Postman or similar tool

Azure Environment

  1. Azure Subscription

    • An active Azure subscription
    • Access to create Azure OpenAI resources
  2. Azure OpenAI Service

    • Azure OpenAI resource created in your subscription
    • A deployed model (e.g., gpt-4 or gpt-35-turbo)
    • Service endpoint URL (format: https://your-resource.openai.azure.com)
    • API key or configured Managed Identity
  3. Required Information

    • Azure OpenAI endpoint URL
    • Azure OpenAI API key
    • Deployment name (the name you gave your GPT model deployment)

MCP Server Access

  • MCP Server URL: https://your-mcp-server.azurewebsites.net (replace with your own MCP server)
  • This server provides time-related tools
  • For this tutorial, we assume the server is publicly accessible (no authentication required)
  • Learn how to build your own MCP server: Building MCP Server with ASP.NET Core

Knowledge Requirements

  • Basic understanding of ASP.NET Core Web APIs
  • Familiarity with dependency injection in .NET
  • Understanding of async/await patterns in C#
  • Basic knowledge of REST APIs and HTTP protocols

Additional Tools

  • Azure CLI: For deployment tasks
  • Git: For version control
  • Application Insights: For monitoring (can be configured later)

Project Setup

Step 1: Create a New ASP.NET Core Web API Project

Open your terminal or command prompt and run:

dotnet new webapi -n McpClient_ManualIntegration
cd McpClient_ManualIntegration

This creates a new ASP.NET Core Web API project named McpClient_ManualIntegration.

Step 2: Install Required NuGet Packages

Install the Azure OpenAI SDK and HTTP client packages:

dotnet add package Azure.AI.OpenAI --version 2.5.0-beta.1
dotnet add package Microsoft.Extensions.Http

Why these packages?

  • Azure.AI.OpenAI: Official SDK for interacting with Azure OpenAI services
  • Microsoft.Extensions.Http: Enhanced HTTP client with dependency injection support

Step 3: Verify Your Project Structure

Your project should now have this structure:

McpClient_ManualIntegration/
├── Controllers/
│   └── WeatherForecastController.cs   (can be deleted)
├── Properties/
│   └── launchSettings.json
├── appsettings.json
├── appsettings.Development.json
├── McpClient_ManualIntegration.csproj
├── McpClient_ManualIntegration.http
├── Program.cs
└── WeatherForecast.cs                 (can be deleted)

Step 4: Clean Up Template Files (Optional)

Remove the default weather forecast files as we won’t need them:

rm Controllers/WeatherForecastController.cs
rm WeatherForecast.cs

Step 5: Verify Your .csproj File

Open McpClient_ManualIntegration.csproj and ensure it looks like this:

<Project Sdk="Microsoft.NET.Sdk.Web">

  <PropertyGroup>
    <TargetFramework>net9.0</TargetFramework>
    <Nullable>enable</Nullable>
    <ImplicitUsings>enable</ImplicitUsings>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Azure.AI.OpenAI" Version="2.5.0-beta.1" />
    <PackageReference Include="Microsoft.AspNetCore.OpenApi" Version="9.0.3" />
    <PackageReference Include="Microsoft.Extensions.Http" Version="9.0.10" />
  </ItemGroup>

</Project>

Configuration

Step 1: Update appsettings.json

Replace the contents of appsettings.json with the following configuration:

{
  "AzureOpenAI": {
    "Endpoint": "https://your-resource.openai.azure.com",
    "ApiKey": "your-api-key-here",
    "DeploymentName": "gpt-4",
    "MaxTokens": 1000,
    "Temperature": 0.7
  },
  "McpServer": {
    "BaseUrl": "https://your-mcp-server.azurewebsites.net",
    "SseEndpoint": "/sse",
    "MessageEndpoint": "/message",
    "ConnectionTimeout": 30,
    "ReconnectAttempts": 3
  },
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft.AspNetCore": "Warning"
    }
  },
  "AllowedHosts": "*"
}
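Every value in this file can also be supplied through environment variables: ASP.NET Core layers its configuration providers, and environment variables override appsettings.json, with nested keys expressed using a double-underscore separator. This is handy for containers and CI; the variable names below simply mirror the JSON keys above:

```shell
# Environment variables override appsettings.json values.
# A nested key like AzureOpenAI:Endpoint becomes AzureOpenAI__Endpoint.
export AzureOpenAI__Endpoint="https://your-resource.openai.azure.com"
export AzureOpenAI__ApiKey="your-api-key-here"
export McpServer__BaseUrl="https://your-mcp-server.azurewebsites.net"
```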

Step 2: Configure Your Azure OpenAI Settings

Replace the placeholder values:

  1. Endpoint: Your Azure OpenAI service endpoint

    • Find this in Azure Portal → Your OpenAI Resource → Keys and Endpoint
    • Format: https://your-resource-name.openai.azure.com
  2. ApiKey: Your Azure OpenAI API key

    • Find in Azure Portal → Your OpenAI Resource → Keys and Endpoint → KEY 1
    • Important: Never commit API keys to source control (see security section below)
  3. DeploymentName: The name of your deployed model

    • Find in Azure Portal → Your OpenAI Resource → Model deployments
    • Common names: gpt-4, gpt-35-turbo, etc.

Step 3: Update appsettings.Development.json

For local development with enhanced logging, update appsettings.Development.json:

{
  "Logging": {
    "LogLevel": {
      "Default": "Debug",
      "System": "Information",
      "Microsoft": "Information",
      "Microsoft.AspNetCore": "Warning"
    }
  }
}

Step 4: Create Configuration Models

Create a new folder called Models and add configuration classes:

mkdir Models

Create Models/AzureOpenAISettings.cs:

namespace McpClient_ManualIntegration.Models;

public class AzureOpenAISettings
{
    public required string Endpoint { get; set; }
    public required string ApiKey { get; set; }
    public required string DeploymentName { get; set; }
    public int MaxTokens { get; set; } = 1000;
    public float Temperature { get; set; } = 0.7f;
}

Create Models/McpServerSettings.cs:

namespace McpClient_ManualIntegration.Models;

public class McpServerSettings
{
    public required string BaseUrl { get; set; }
    public string SseEndpoint { get; set; } = "/sse";
    public string MessageEndpoint { get; set; } = "/message";
    public int ConnectionTimeout { get; set; } = 30;
    public int ReconnectAttempts { get; set; } = 3;
}

Step 5: Security Best Practices for Configuration

Never commit sensitive data to source control:

Create .gitignore if it doesn’t exist:

# This pattern also covers appsettings.Development.json
echo "appsettings.*.json" >> .gitignore

For production: Use Managed Identity instead of API keys (see Using Managed Identity section below). This eliminates the need for Azure Key Vault and secret management entirely.
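For local development, the .NET user-secrets store is another option that keeps the API key out of the repository entirely (a sketch; note that user secrets are loaded automatically only when the environment is Development):

```shell
# Initializes a UserSecretsId in the .csproj and stores the key
# outside the project directory, in your user profile
dotnet user-secrets init
dotnet user-secrets set "AzureOpenAI:ApiKey" "your-api-key-here"
```

The colon-separated key maps to the same `AzureOpenAI:ApiKey` configuration path the `IOptions` binding reads, so no code changes are needed.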

Implementing the Azure OpenAI Service

The Azure OpenAI Service handles all interactions with the Azure OpenAI API, including sending chat messages and processing tool calls.

Step 1: Implement the Azure OpenAI Service

Create a Services folder:

mkdir Services

Important Note on Authentication: This guide shows API key authentication for simplicity. For production deployments, Managed Identity is strongly recommended as it eliminates the need to store and manage API keys. See the Using Managed Identity section below for implementation details.

Create Services/AzureOpenAIService.cs:

using Azure;
using Azure.AI.OpenAI;
using McpClient_ManualIntegration.Models;
using Microsoft.Extensions.Options;
using OpenAI.Chat;

namespace McpClient_ManualIntegration.Services;

public class AzureOpenAIService(
    IOptions<AzureOpenAISettings> settings,
    ILogger<AzureOpenAIService> logger)
{
    private readonly AzureOpenAISettings _settings = settings.Value;
    private readonly ChatClient _chatClient = new AzureOpenAIClient(
        new Uri(settings.Value.Endpoint),
        new AzureKeyCredential(settings.Value.ApiKey))
        .GetChatClient(settings.Value.DeploymentName);

    public async Task<ChatCompletion> GetChatCompletionAsync(
        List<ChatMessage> messages,
        ChatCompletionOptions? options = null)
    {
        logger.LogInformation("Sending chat completion request to Azure OpenAI");

        options = EnsureChatCompletionOptions(options);
        var completion = await _chatClient.CompleteChatAsync(messages, options);

        logger.LogInformation("Received response from Azure OpenAI");
        return completion.Value;
    }

    public async Task<ChatCompletion> GetChatCompletionWithToolsAsync(
        List<ChatMessage> messages,
        IEnumerable<ChatTool> tools,
        ChatCompletionOptions? options = null)
    {
        logger.LogInformation("Sending chat completion request with tools to Azure OpenAI");

        options = EnsureChatCompletionOptions(options);

        // Add tools to the options
        foreach (var tool in tools)
        {
            options.Tools.Add(tool);
        }

        var completion = await _chatClient.CompleteChatAsync(messages, options);

        logger.LogInformation("Received response from Azure OpenAI with {ToolCallCount} tool calls",
            completion.Value.FinishReason == ChatFinishReason.ToolCalls
                ? completion.Value.ToolCalls.Count
                : 0);

        return completion.Value;
    }

    private ChatCompletionOptions EnsureChatCompletionOptions(ChatCompletionOptions? options)
    {
        return options ?? new ChatCompletionOptions
        {
            MaxOutputTokenCount = _settings.MaxTokens,
            Temperature = _settings.Temperature
        };
    }
}

Step 2: Understanding the Implementation

Key Components:

  1. Primary Constructor Syntax: Uses C# 12 primary constructor

    • Parameters declared in class declaration
    • Fields initialized inline with the constructor parameters
    • More concise than traditional constructor syntax
  2. AzureOpenAIClient: Main client for Azure OpenAI service

    • Initialized with endpoint URL and API key
    • Thread-safe and reusable
  3. ChatClient: Specific client for chat completions

    • Retrieved from AzureOpenAIClient using deployment name
    • Handles all chat-related operations
  4. Error Handling: Comprehensive logging with exception propagation

    • Logs all requests and responses
    • Exceptions propagate directly to ASP.NET Core middleware for centralized handling
  5. Configuration: Uses IOptions pattern for settings

    • Type-safe configuration
    • Supports configuration reloading
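To make the service's surface concrete, here is a minimal sketch of a caller. This is hypothetical code: it assumes an `AzureOpenAIService` instance (`azureOpenAIService`) has already been resolved from the DI container, and it uses the `ChatMessage` types from the `OpenAI.Chat` namespace that the service file imports:

```csharp
using OpenAI.Chat;

// Build a short conversation and request a completion.
// 'azureOpenAIService' is assumed to come from dependency injection.
List<ChatMessage> messages =
[
    new SystemChatMessage("You are a helpful assistant."),
    new UserChatMessage("Say hello in one short sentence.")
];

ChatCompletion completion = await azureOpenAIService.GetChatCompletionAsync(messages);
Console.WriteLine(completion.Content[0].Text);
```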

Implementing the MCP Client Service

The MCP Client Service manages the connection to the MCP server and handles tool execution requests.

Step 1: Create Request and Response Models

Create Models/McpToolResult.cs:

namespace McpClient_ManualIntegration.Models;

public class McpToolResult
{
    public bool IsSuccess { get; set; }
    public string? Content { get; set; }
    public string? Error { get; set; }
}

Create Models/McpJsonRpcRequest.cs:

using System.Text.Json.Serialization;

namespace McpClient_ManualIntegration.Models;

public class McpJsonRpcRequest
{
    [JsonPropertyName("jsonrpc")]
    public string JsonRpc { get; set; } = "2.0";

    [JsonPropertyName("id")]
    public int Id { get; set; }

    [JsonPropertyName("method")]
    public required string Method { get; set; }

    [JsonPropertyName("params")]
    public required object Params { get; set; }
}

public class ToolCallParams
{
    [JsonPropertyName("name")]
    public required string Name { get; set; }

    [JsonPropertyName("arguments")]
    public Dictionary<string, object>? Arguments { get; set; }
}
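With these `[JsonPropertyName]` attributes, a serialized `tools/call` request for the timezone tool (defined later in this guide) looks like this on the wire. The values are illustrative, and the `id` is arbitrary because the client waits for the matching result on the SSE stream:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_current_time_for_timezone",
    "arguments": { "timeZone": "Europe/London" }
  }
}
```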

Step 2: Implement the MCP Client Service

Create Services/McpClientService.cs:

using System.Net.Http.Headers;
using System.Text;
using System.Text.Json;
using McpClient_ManualIntegration.Models;
using Microsoft.Extensions.Options;

namespace McpClient_ManualIntegration.Services;

public class McpClientService(
    IHttpClientFactory httpClientFactory,
    IOptions<McpServerSettings> settings,
    ILogger<McpClientService> logger)
{
    private readonly McpServerSettings _settings = settings.Value;

    public async Task<StreamReader> CreateSessionAsync()
    {
        var httpClient = CreateConfiguredHttpClient();
        var sseUrl = $"{_settings.BaseUrl}{_settings.SseEndpoint}";

        var request = new HttpRequestMessage(HttpMethod.Get, sseUrl);
        request.Headers.Accept.Add(new MediaTypeWithQualityHeaderValue("text/event-stream"));

        var response = await httpClient.SendAsync(request, HttpCompletionOption.ResponseHeadersRead);
        response.EnsureSuccessStatusCode();

        var stream = await response.Content.ReadAsStreamAsync();
        return new StreamReader(stream);
    }

    public async Task<McpToolResult> ExecuteToolAsync(string toolName, Dictionary<string, object>? arguments = null)
    {
        logger.LogInformation("Creating MCP session at {BaseUrl}", _settings.BaseUrl);
        using var reader = await CreateSessionAsync();
        while (!reader.EndOfStream)
        {
            var line = await reader.ReadLineAsync();
            if (string.IsNullOrEmpty(line))
            {
                continue;
            }

            if (!TryExtractData(line, out var data))
            {
                continue;
            }

            if (data.Contains("message?sessionId="))
            {
                var sessionId = data.Split("sessionId=")[1];
                logger.LogInformation("Created MCP session: {SessionId}", sessionId);

                await ExecuteToolAsync(sessionId, toolName, arguments);
                continue;
            }

            if (!data.TrimStart().StartsWith('{'))
            {
                // Skip SSE payloads that are not JSON (e.g., keep-alive pings)
                continue;
            }

            using var jsonDoc = JsonDocument.Parse(data);
            if (jsonDoc.RootElement.TryGetProperty("result", out var resultElement))
            {
                // Extract the content from the result
                var resultContent = resultElement.GetProperty("content")[0].GetProperty("text").GetString();

                logger.LogInformation("Tool {ToolName} executed successfully", toolName);

                return new McpToolResult
                {
                    IsSuccess = true,
                    Content = resultContent
                };
            }
        }

        return new McpToolResult
        {
            IsSuccess = false,
            Error = "No response received from MCP server"
        };
    }

    private static bool TryExtractData(string source, out string data)
    {
        const string prefix = "data: ";
        data = string.Empty;
        if (!source.StartsWith(prefix))
        {
            return false;
        }
        data = source[prefix.Length..];
        return true;
    }

    private async Task ExecuteToolAsync(string sessionId, string toolName, Dictionary<string, object>? arguments = null)
    {
        var httpClient = CreateConfiguredHttpClient();

        // Create JSON-RPC request
        var request = new McpJsonRpcRequest
        {
            Id = 1, // Simple ID since we wait for immediate response
            Method = "tools/call",
            Params = new ToolCallParams
            {
                Name = toolName,
                Arguments = arguments
            }
        };

        var json = JsonSerializer.Serialize(request);
        var content = new StringContent(json, Encoding.UTF8, "application/json");

        // Send request to MCP server
        var messageUrl = $"{_settings.BaseUrl}{_settings.MessageEndpoint}?sessionId={sessionId}";
        var response = await httpClient.PostAsync(messageUrl, content);
        response.EnsureSuccessStatusCode();
    }

    private HttpClient CreateConfiguredHttpClient()
    {
        var httpClient = httpClientFactory.CreateClient();
        httpClient.Timeout = TimeSpan.FromSeconds(_settings.ConnectionTimeout);
        return httpClient;
    }
}

Step 3: Understanding the Implementation

Key Features:

  1. Primary Constructor Syntax: Uses C# 12 primary constructor

    • Parameters declared in class declaration
    • Direct field initialization
    • More concise code
  2. Integrated Session Management:

    • CreateSessionAsync() returns a StreamReader for the SSE connection
    • Public ExecuteToolAsync() creates a session and reads responses in a single operation
    • Private ExecuteToolAsync() overload sends the JSON-RPC request with session ID
  3. Tool Execution Flow:

    • Creates a new SSE connection for each tool execution
    • Reads SSE stream until session ID is found
    • Sends JSON-RPC 2.0 formatted request to MCP server
    • Continues reading SSE stream until tool result arrives
    • Returns structured result
  4. Data Extraction Helper:

    • TryExtractData() safely extracts the payload from SSE "data: "-prefixed lines
    • Uses modern C# range syntax ([prefix.Length..])
    • Returns false for non-data lines
  5. Simplified Architecture:

    • No explicit session ID management by callers
    • Each tool execution is self-contained
    • No background tasks or threading complexity
    • Synchronous request/response within SSE stream
  6. Error Handling:

    • Comprehensive logging at each step
    • Exceptions for connection failures propagate to ASP.NET Core middleware
    • Returns an error result if the SSE stream ends without a tool result
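Concretely, the SSE traffic this loop parses looks roughly like the following. This is an illustrative exchange; the exact event names, endpoint path, and sessionId format depend on your MCP server implementation:

```text
event: endpoint
data: /message?sessionId=abc123

event: message
data: {"jsonrpc":"2.0","id":1,"result":{"content":[{"type":"text","text":"2025-01-01T12:00:00Z"}]}}
```

The first data line triggers the JSON-RPC POST to the message endpoint; the second satisfies the `TryGetProperty("result", ...)` check and is returned as the tool result.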

Creating the Chat Orchestrator

The Chat Orchestrator coordinates the entire flow between the user, Azure OpenAI, and the MCP server.

Step 1: Create Request and Response Models

Create Models/ChatRequest.cs:

namespace McpClient_ManualIntegration.Models;

public class ChatRequest
{
    public required string Message { get; set; }
    public string? ConversationId { get; set; }
}

Create Models/ChatResponse.cs:

namespace McpClient_ManualIntegration.Models;

public class ChatResponse
{
    public required string Message { get; set; }
    public required string ConversationId { get; set; }
    public int TokensUsed { get; set; }
}

Step 2: Implement the Chat Orchestrator

Create Services/ChatOrchestrator.cs:

using System.Text.Json;
using McpClient_ManualIntegration.Models;
using OpenAI.Chat;

namespace McpClient_ManualIntegration.Services;

public class ChatOrchestrator(
    AzureOpenAIService azureOpenAIService,
    McpClientService mcpClientService,
    ILogger<ChatOrchestrator> logger)
{
    // In-memory conversation storage (replace with database for production)
    private readonly Dictionary<string, ConversationState> _conversations = [];

    public async Task<ChatResponse> ProcessChatAsync(string userMessage, string? conversationId = null)
    {
        // Generate or retrieve conversation ID
        conversationId ??= Guid.NewGuid().ToString();

        // Get or create conversation state
        if (!_conversations.ContainsKey(conversationId))
        {
            _conversations[conversationId] = new ConversationState();
        }

        var conversation = _conversations[conversationId];

        // Add system message if this is a new conversation
        if (conversation.IsNewConversation())
        {
            conversation.NewSystemMessage(
                "You are a helpful AI assistant with access to tools for getting time information. " +
                "When users ask about time in specific locations, use the available tools to provide accurate information."
            );
        }

        // Add user message
        conversation.NewUserMessage(userMessage);

        // Define available MCP tools
        var tools = GetAvailableTools();

        // First call to Azure OpenAI with tool definitions
        var completion = await azureOpenAIService.GetChatCompletionWithToolsAsync(conversation.Messages, tools);

        var responseMessage = completion.Content.Count > 0 ? completion.Content[0].Text : string.Empty;
        var finishReason = completion.FinishReason;

        // Check if the LLM wants to use tools
        if (finishReason == ChatFinishReason.ToolCalls)
        {
            logger.LogInformation("LLM requested {ToolCallCount} tool calls", completion.ToolCalls.Count);

            // Add assistant's tool call request to conversation
            conversation.NewAssistantMessage(completion);

            // Execute each tool call
            foreach (var toolCall in completion.ToolCalls)
            {
                var toolName = toolCall.FunctionName;
                var toolArguments = ParseToolArguments(toolCall.FunctionArguments);

                logger.LogInformation("Executing tool: {ToolName}", toolName);

                // Execute the tool via MCP with the session ID
                var toolResult = await mcpClientService.ExecuteToolAsync(toolName, toolArguments);

                if (toolResult.IsSuccess)
                {
                    // Add tool result to conversation
                    conversation.NewToolMessage(toolCall.Id, toolResult.Content ?? "");
                    logger.LogInformation("Tool {ToolName} executed successfully", toolName);
                }
                else
                {
                    logger.LogError("Tool {ToolName} failed: {Error}", toolName, toolResult.Error);
                    conversation.NewToolMessage(toolCall.Id, $"Error: {toolResult.Error}");
                }
            }

            // Second call to Azure OpenAI with tool results
            var finalCompletion = await azureOpenAIService.GetChatCompletionWithToolsAsync(conversation.Messages, tools);
            responseMessage = finalCompletion.Content.Count > 0 ? finalCompletion.Content[0].Text : string.Empty;

            // Add final assistant message to conversation
            conversation.NewAssistantMessage(responseMessage);

            return new ChatResponse
            {
                Message = responseMessage,
                ConversationId = conversationId,
                TokensUsed = finalCompletion.Usage.TotalTokenCount
            };
        }

        // No tools needed, return direct response
        conversation.NewAssistantMessage(responseMessage);

        return new ChatResponse
        {
            Message = responseMessage,
            ConversationId = conversationId,
            TokensUsed = completion.Usage.TotalTokenCount
        };
    }

    // Helper class to store per-conversation message history
    private class ConversationState
    {
        public List<ChatMessage> Messages { get; } = [];
        public bool IsNewConversation() => Messages.Count == 0;

        public void NewSystemMessage(string message) => Messages.Add(new SystemChatMessage(message));
        public void NewUserMessage(string message) => Messages.Add(new UserChatMessage(message));
        public void NewAssistantMessage(string message) => Messages.Add(new AssistantChatMessage(message));
        public void NewAssistantMessage(ChatCompletion completion) => Messages.Add(new AssistantChatMessage(completion));
        public void NewToolMessage(string toolCallId, string content) => Messages.Add(new ToolChatMessage(toolCallId, content));
    }

    private static List<ChatTool> GetAvailableTools()
    {
        return
        [
            ChatTool.CreateFunctionTool(
                functionName: "get_current_utc_time",
                functionDescription: "Gets the current UTC time. Use this when you need to know the current time in UTC timezone.",
                functionParameters: BinaryData.FromString("{\"type\": \"object\", \"properties\": {}}")
            ),
            ChatTool.CreateFunctionTool(
                functionName: "get_current_time_for_timezone",
                functionDescription: "Gets the current time for a specific timezone. Provide a valid IANA timezone identifier (e.g., 'America/New_York', 'Europe/London', 'Asia/Tokyo').",
                functionParameters: BinaryData.FromString(@"{
                    ""type"": ""object"",
                    ""properties"": {
                        ""timeZone"": {
                            ""type"": ""string"",
                            ""description"": ""IANA timezone identifier (e.g., America/New_York, Europe/London, Asia/Tokyo)""
                        }
                    },
                    ""required"": [""timeZone""]
                }")
            )
        ];
    }

    private static Dictionary<string, object>? ParseToolArguments(BinaryData functionArguments)
    {
        if (functionArguments == null || functionArguments.ToString() == "{}")
        {
            return null;
        }

        var json = functionArguments.ToString();
        return JsonSerializer.Deserialize<Dictionary<string, object>>(json);
    }
}

Step 3: Understanding the Orchestration Flow

Flow Overview:

  1. The user sends a chat request.
  2. The orchestrator adds the user message to the conversation and calls Azure OpenAI with the MCP tool definitions.
  3. If the LLM's finish reason is ToolCalls, each requested tool is executed via the MCP server.
  4. Tool results are added to the conversation, and Azure OpenAI is called a second time.
  5. The final response (or the direct response, if no tools were needed) is returned to the user along with the conversation ID.

Key Features:

  1. Primary Constructor Syntax: Uses C# 12 primary constructor

    • Parameters declared in class declaration
    • Direct field and parameter usage
    • More concise code
  2. Conversation Management:

    • Maintains conversation history per conversation ID
    • No MCP session ID stored - sessions are created per tool execution
    • Uses collection expression syntax [] for dictionary initialization
    • Adds system message at the start of each conversation via helper methods
    • Stores all messages (user, assistant, tool calls, tool results)
  3. ConversationState Helper Methods:

    • IsNewConversation() - checks if conversation has started
    • NewSystemMessage(), NewUserMessage(), NewAssistantMessage() - add messages to conversation
    • NewToolMessage() - adds tool results to conversation
    • Encapsulates message management logic
  4. Tool Definition:

    • Defines available MCP tools in Azure OpenAI’s format
    • Uses collection expression syntax [] instead of new List<>()
    • Includes tool descriptions to help LLM decide when to use them
    • Specifies parameter schemas for each tool
  5. Multi-Turn Flow:

    • First call: Determines if tools are needed
    • Tool execution: Calls MCP server (which creates its own session) for each tool
    • Second call: Generates final response with tool results
  6. Error Handling:

    • Handles tool execution failures gracefully (returned in McpToolResult)
    • Passes errors back to LLM for appropriate response
    • Exceptions from Azure OpenAI propagate to ASP.NET Core middleware
    • Logs all operations for debugging

Building the API Endpoints

Now we’ll create the REST API endpoints that clients will use to interact with our application.

Step 1: Update Program.cs

Replace the contents of Program.cs with the following:

using McpClient_ManualIntegration.Models;
using McpClient_ManualIntegration.Services;

var builder = WebApplication.CreateBuilder(args);

// Configure settings
builder.Services.Configure<AzureOpenAISettings>(
    builder.Configuration.GetSection("AzureOpenAI"));
builder.Services.Configure<McpServerSettings>(
    builder.Configuration.GetSection("McpServer"));

// Register HTTP client factory
builder.Services.AddHttpClient();

// Register services
builder.Services.AddSingleton<AzureOpenAIService>();
builder.Services.AddScoped<McpClientService>();
builder.Services.AddScoped<ChatOrchestrator>();

var app = builder.Build();

// Map endpoints
app.MapPost("/api/chat", async (
    ChatRequest request,
    ChatOrchestrator orchestrator,
    ILogger<Program> logger) =>
{
    logger.LogInformation("Received chat request: {Message}", request.Message);

    var response = await orchestrator.ProcessChatAsync(
        request.Message,
        request.ConversationId);

    return Results.Ok(response);
});

app.MapGet("/api/health", async (
    McpClientService mcpClient) =>
{
    // Try to create a test session to verify MCP server connectivity,
    // disposing the reader (and its underlying stream) when done
    using var reader = await mcpClient.CreateSessionAsync();

    var health = new
    {
        Status = "Healthy",
        AzureOpenAI = "Connected",
        McpServer = "Connected",
        Timestamp = DateTime.UtcNow
    };

    return Results.Ok(health);
});

app.Run();

Step 2: Understanding the Endpoint Configuration

Service Registration:

  1. Configuration Binding:

    • Configure<T>() binds configuration sections to strongly-typed settings
    • Settings are injected via IOptions<T>
  2. Service Lifetimes:

    • Singleton: Azure OpenAI service (one instance per application, thread-safe)
    • Scoped: MCP client and Chat orchestrator (one instance per request)
    • Why Scoped for MCP? The client is stateless (each tool call creates its own session), so Scoped simply aligns its lifetime with the request
  3. HTTP Client Factory:

    • Manages HttpClient instances efficiently
    • Handles connection pooling and DNS refresh

Endpoints:

  1. POST /api/chat:

    • Accepts chat requests with user messages
    • Returns chat responses with AI-generated content
    • Handles conversation continuity via conversation ID
    • Exceptions propagate to ASP.NET Core middleware for centralized error handling
  2. GET /api/health:

    • Verifies MCP server connectivity by creating a test session
    • Returns health status and timestamp
    • Useful for monitoring and load balancers
    • Throws exception if MCP server is unreachable (handled by middleware)
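With the app running (`dotnet run`), both endpoints can be exercised from a `.http` file in Visual Studio or VS Code's REST Client. The port below is a placeholder; use the one printed at startup or listed in Properties/launchSettings.json:

```http
### Health check - verifies MCP server connectivity
GET https://localhost:5001/api/health

### Start a new conversation (omit conversationId)
POST https://localhost:5001/api/chat
Content-Type: application/json

{
  "message": "What time is it in Tokyo?"
}

### Continue the conversation using the conversationId from the response
POST https://localhost:5001/api/chat
Content-Type: application/json

{
  "message": "And in London?",
  "conversationId": "replace-with-returned-id"
}
```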

Using Managed Identity (Alternative)

For production deployments, using Managed Identity instead of API keys is the recommended approach. This eliminates the need for Azure Key Vault and API key management.

Benefits of Managed Identity

  • No secrets to manage: No API keys stored anywhere
  • Automatic credential rotation: Azure handles it automatically
  • Simplified security: Just RBAC role assignments
  • Better compliance: Meets zero-trust security requirements
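Granting access amounts to a single RBAC role assignment on the Azure OpenAI resource. A sketch using the Azure CLI (all angle-bracket values are placeholders you must fill in; the built-in role that permits inference calls is Cognitive Services OpenAI User):

```shell
# Grant the app's managed identity permission to call the OpenAI resource.
# <principal-id> is the identity's object ID; --scope is the resource ID.
az role assignment create \
  --role "Cognitive Services OpenAI User" \
  --assignee "<principal-id>" \
  --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.CognitiveServices/accounts/<openai-resource>"
```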

Step 1: Install Required Package

Add the Azure Identity SDK:

dotnet add package Azure.Identity

Step 2: Update AzureOpenAISettings Model

Remove the ApiKey property since it’s no longer needed:

namespace McpClient_ManualIntegration.Models;

public class AzureOpenAISettings
{
    public required string Endpoint { get; set; }
    public required string DeploymentName { get; set; }
    public int MaxTokens { get; set; } = 1000;
    public float Temperature { get; set; } = 0.7f;
}

Step 3: Update appsettings.json

Remove the API key:

{
  "AzureOpenAI": {
    "Endpoint": "https://your-resource.openai.azure.com",
    "DeploymentName": "gpt-4",
    "MaxTokens": 1000,
    "Temperature": 0.7
  }
}

Step 4: Update AzureOpenAIService Implementation

Replace the API key authentication with Managed Identity:

using Azure.AI.OpenAI;
using Azure.Identity;
using McpClient_ManualIntegration.Models;
using Microsoft.Extensions.Options;
using OpenAI.Chat;

namespace McpClient_ManualIntegration.Services;

public class AzureOpenAIService(
    IOptions<AzureOpenAISettings> settings,
    ILogger<AzureOpenAIService> logger)
{
    private readonly AzureOpenAISettings _settings = settings.Value;

    // A single AzureOpenAIClient is created once and its ChatClient reused for all requests.
    private readonly ChatClient _chatClient = new AzureOpenAIClient(
        new Uri(settings.Value.Endpoint),
        new DefaultAzureCredential())
        .GetChatClient(settings.Value.DeploymentName);

    public async Task<ChatCompletion> GetChatCompletionAsync(
        List<ChatMessage> messages,
        ChatCompletionOptions? options = null)
    {
        logger.LogInformation("Sending chat completion request to Azure OpenAI");

        options = EnsureChatCompletionOptions(options);
        var completion = await _chatClient.CompleteChatAsync(messages, options);

        logger.LogInformation("Received response from Azure OpenAI");
        return completion.Value;
    }

    public async Task<ChatCompletion> GetChatCompletionWithToolsAsync(
        List<ChatMessage> messages,
        IEnumerable<ChatTool> tools,
        ChatCompletionOptions? options = null)
    {
        logger.LogInformation("Sending chat completion request with tools to Azure OpenAI");

        options = EnsureChatCompletionOptions(options);

        // Add tools to the options
        foreach (var tool in tools)
        {
            options.Tools.Add(tool);
        }

        var completion = await _chatClient.CompleteChatAsync(messages, options);

        logger.LogInformation("Received response from Azure OpenAI with {ToolCallCount} tool calls",
            completion.Value.FinishReason == ChatFinishReason.ToolCalls
                ? completion.Value.ToolCalls.Count
                : 0);

        return completion.Value;
    }

    private ChatCompletionOptions EnsureChatCompletionOptions(ChatCompletionOptions? options)
    {
        return options ?? new ChatCompletionOptions
        {
            MaxOutputTokenCount = _settings.MaxTokens,
            Temperature = _settings.Temperature
        };
    }
}

Note: This implementation uses C# 12 primary constructor syntax with Managed Identity authentication. The DefaultAzureCredential automatically uses:

  • Managed Identity when deployed to Azure
  • Azure CLI credentials for local development
  • Visual Studio credentials as a fallback
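If credential probing is slow at startup or picks the wrong account locally, the chain can be narrowed with `DefaultAzureCredentialOptions`. A sketch — the client ID is a placeholder and only needed for user-assigned identities:

```csharp
using Azure.Identity;

// Narrow the DefaultAzureCredential chain to speed up probing and,
// when using a user-assigned Managed Identity, pin a specific identity.
var credential = new DefaultAzureCredential(new DefaultAzureCredentialOptions
{
    ManagedIdentityClientId = "<user-assigned-identity-client-id>", // omit for system-assigned
    ExcludeInteractiveBrowserCredential = true,
    ExcludeVisualStudioCodeCredential = true
});
```

Pass this `credential` wherever the service currently constructs `new DefaultAzureCredential()`.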

Step 5: Configure Managed Identity in Azure

When deploying to Azure App Service, enable System-Assigned Managed Identity:

# Enable Managed Identity for your App Service
az webapp identity assign \
  --name your-mcp-client-app \
  --resource-group rg-mcp-client

# Get the principal ID (you'll need this for role assignment)
principalId=$(az webapp identity show \
  --name your-mcp-client-app \
  --resource-group rg-mcp-client \
  --query principalId -o tsv)

# Assign "Cognitive Services OpenAI User" role to the Managed Identity
az role assignment create \
  --role "Cognitive Services OpenAI User" \
  --assignee $principalId \
  --scope /subscriptions/{subscription-id}/resourceGroups/{openai-resource-group}/providers/Microsoft.CognitiveServices/accounts/{openai-resource-name}

Replace the placeholders:

  • {subscription-id}: Your Azure subscription ID
  • {openai-resource-group}: Resource group containing your Azure OpenAI resource
  • {openai-resource-name}: Name of your Azure OpenAI resource

Step 6: Local Development

For local development, DefaultAzureCredential automatically uses your Azure CLI credentials:

# Login to Azure CLI
az login

# Set your default subscription
az account set --subscription {subscription-id}

Now you can run your application locally without any API keys!

Step 7: Remove Key Vault References

With Managed Identity, you no longer need:

  • Azure Key Vault setup
  • Secret storage and rotation
  • Complex access policies
  • User Secrets for local development (though you can still use them for other settings)

Summary

Before (API Key approach):

  1. Create Key Vault
  2. Store API key as secret
  3. Configure access policies
  4. Update application to retrieve secret
  5. Manage secret rotation

After (Managed Identity approach):

  1. Enable Managed Identity on App Service
  2. Assign RBAC role to identity
  3. Update code to use DefaultAzureCredential

Result: Simpler, more secure, and fully managed by Azure!

Testing and Validation

Step 1: Create Test Files

Create a .http folder in your project root:

mkdir .http

Step 2: Test Health Endpoint

Create .http/health.http:

### Health Check
GET http://localhost:5000/api/health
Accept: application/json

Step 3: Test Simple Chat (No Tools)

Create .http/chat_simple.http:

### Simple Chat Without Tools
POST http://localhost:5000/api/chat
Content-Type: application/json

{
  "message": "Hello! How are you today?"
}

Step 4: Test Chat with Tool (Single Tool Call)

Create .http/chat_with_tool.http:

### Chat Requiring Tool Call
POST http://localhost:5000/api/chat
Content-Type: application/json

{
  "message": "What time is it in New York right now?"
}

Step 5: Test Conversation Continuity

Create .http/chat_conversation.http:

### First message in conversation
POST http://localhost:5000/api/chat
Content-Type: application/json

{
  "message": "What time is it in London?"
}

###
# Copy the conversationId from the response above and use it below

### Follow-up message in same conversation
POST http://localhost:5000/api/chat
Content-Type: application/json

{
  "message": "How about in Tokyo?",
  "conversationId": "paste-conversation-id-here"
}

Step 6: Test Multiple Tool Calls

Create .http/chat_multiple_tools.http:

### Chat Requiring Multiple Tool Calls
POST http://localhost:5000/api/chat
Content-Type: application/json

{
  "message": "Can you tell me the current time in New York, London, and Tokyo?"
}

Step 7: Running Tests

  1. Start the application:

    dotnet run
    
  2. Note the port: The application will display the port (usually 5000 or 5237)

  3. Update .http files: Replace localhost:5000 with the actual port

  4. Execute tests: Use your IDE’s .http file support or a REST client

Step 8: Expected Results

Health Check Response:

{
  "status": "Healthy",
  "azureOpenAI": "Connected",
  "mcpServer": "Connected",
  "timestamp": "2025-10-14T12:00:00Z"
}

Simple Chat Response:

{
  "message": "Hello! I'm doing great, thank you for asking! I'm here to help you with any questions you have. How can I assist you today?",
  "conversationId": "a1b2c3d4-e5f6-7890-abcd-ef1234567890",
  "tokensUsed": 145
}

Chat with Tool Response:

{
  "message": "It is currently 8:56 PM (20:56) in New York (America/New_York timezone).",
  "conversationId": "b2c3d4e5-f6a7-8901-bcde-f12345678901",
  "tokensUsed": 287
}

Step 9: Validation Checklist

  • Health endpoint returns status 200 and shows both services connected
  • Simple chat works without tools and returns coherent responses
  • Chat with tool requests correctly identifies when to use tools
  • Tool execution returns accurate time information
  • Conversation continuity works (follow-up messages use context)
  • Multiple tool calls work in a single request
  • Error handling works gracefully (try invalid timezone)

Step 10: Testing Error Scenarios

Create .http/chat_error.http:

### Test Invalid Timezone (Error Handling)
POST http://localhost:5000/api/chat
Content-Type: application/json

{
  "message": "What time is it in InvalidCityName?"
}

The LLM should gracefully handle the error and provide a helpful response.

Troubleshooting

Common Issues and Solutions

Issue 1: “Cannot connect to Azure OpenAI”

Symptoms:

  • Error: "The remote name could not be resolved"
  • Status 401 or 403 responses

Solutions:

  1. Verify Endpoint URL:

    # Check your endpoint in appsettings.json
    # It should match: https://[your-resource-name].openai.azure.com
    
  2. Verify API Key:

    # Check if API key is correctly set
    dotnet user-secrets list
    
    # If missing, set it:
    dotnet user-secrets set "AzureOpenAI:ApiKey" "your-actual-key"
    
  3. Check Azure Portal:

    • Go to your Azure OpenAI resource
    • Navigate to “Keys and Endpoint”
    • Verify endpoint URL and copy a valid key
  4. Verify Deployment Name:

    • Go to Azure Portal → Your OpenAI Resource → Model deployments
    • Ensure the deployment name in appsettings.json matches exactly

Issue 2: “Cannot connect to MCP server”

Symptoms:

  • Error: "Failed to obtain session ID from MCP server"
  • Timeout errors

Solutions:

  1. Test MCP Server Directly:

    curl --no-buffer https://your-mcp-server.azurewebsites.net/sse
    
  2. Check Network Connectivity:

    • Ensure your firewall allows outbound HTTPS connections
    • Verify no proxy blocking the connection
  3. Increase Timeout: Update appsettings.json:

    "McpServer": {
      "ConnectionTimeout": 60,  // Increase to 60 seconds
      ...
    }
    
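Note that the timeout setting only helps if the `HttpClient` actually honors it. Assuming your MCP settings class is named `McpServerSettings` and is bound to the `McpServer` section (names may differ in your project), the wiring in Program.cs could look like:

```csharp
using Microsoft.Extensions.Options;

// Program.cs — apply the configured ConnectionTimeout to the MCP HttpClient.
builder.Services.AddHttpClient<McpClientService>((serviceProvider, client) =>
{
    var settings = serviceProvider
        .GetRequiredService<IOptions<McpServerSettings>>().Value;
    client.BaseAddress = new Uri(settings.BaseUrl);
    client.Timeout = TimeSpan.FromSeconds(settings.ConnectionTimeout);
});
```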

Issue 3: “Tool execution fails”

Symptoms:

  • Error: "Session not found"
  • Tool returns null or error

Solutions:

  1. Check Session Creation:

    • Ensure CreateSessionAsync() is called to get a session ID
    • Verify session ID is being passed to ExecuteToolAsync()
    • Check logs for session creation status
  2. Verify Tool Names:

    • Tool names must match exactly: get_current_utc_time, get_current_time_for_timezone
    • Check capitalization and underscores
  3. Check Tool Arguments:

    // Correct format:
    {
      "timeZone": "America/New_York"  // Valid IANA timezone
    }
    
    // Invalid:
    {
      "timezone": "EST"  // Wrong parameter name and format
    }
    

Issue 4: “Request timeout”

Symptoms:

  • Long wait times before timeout
  • No response received

Solutions:

  1. Check Azure OpenAI Quota:

    • Verify you haven’t exceeded token limits
    • Check Azure Portal for quota usage
  2. Reduce Token Limits:

    "AzureOpenAI": {
      "MaxTokens": 500,  // Reduce from 1000
      ...
    }
    
  3. Enable Request Logging: Update appsettings.Development.json:

    {
      "Logging": {
        "LogLevel": {
          "Default": "Debug",
          "McpClient": "Debug"
        }
      }
    }
    

Issue 5: “JSON deserialization error”

Symptoms:

  • Error: "The JSON value could not be converted"
  • Parsing errors in logs

Solutions:

  1. Check Request Format: Ensure your .http files use proper JSON:

    Content-Type: application/json
    
    {
      "message": "Your message here"  // No trailing comma
    }
    
  2. Verify Response Handling:

    • Check that SSE responses match expected format
    • Review logs for actual response content

Issue 6: “Conversation not maintained”

Symptoms:

  • Follow-up questions don’t use context
  • Bot seems to forget previous messages

Solutions:

  1. Use Conversation ID:

    {
      "message": "Follow-up question",
      "conversationId": "paste-id-from-first-response"
    }
    
  2. Check Memory Storage:

    • Current implementation uses in-memory dictionary
    • Conversations lost on application restart
    • For production, use persistent storage (Redis, Cosmos DB)
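To make conversations survive restarts, the in-memory dictionary can be swapped for `IDistributedCache` (backed by Redis via the `Microsoft.Extensions.Caching.StackExchangeRedis` package). A sketch with an illustrative DTO — the OpenAI SDK's `ChatMessage` types are not directly JSON-serializable, so persist a simplified shape and map back when rebuilding history:

```csharp
using System.Text.Json;
using Microsoft.Extensions.Caching.Distributed;

// Simplified, serializable shape for persisted conversation history.
public record StoredMessage(string Role, string Content);

public class ConversationStore(IDistributedCache cache)
{
    private static readonly DistributedCacheEntryOptions Ttl = new()
    {
        SlidingExpiration = TimeSpan.FromHours(1) // drop idle conversations
    };

    public Task SaveAsync(string conversationId, List<StoredMessage> history) =>
        cache.SetStringAsync($"conv:{conversationId}",
            JsonSerializer.Serialize(history), Ttl);

    public async Task<List<StoredMessage>?> LoadAsync(string conversationId)
    {
        var json = await cache.GetStringAsync($"conv:{conversationId}");
        return json is null
            ? null
            : JsonSerializer.Deserialize<List<StoredMessage>>(json);
    }
}
```

Register Redis with `builder.Services.AddStackExchangeRedisCache(options => options.Configuration = "<redis-connection-string>");` and inject `ConversationStore` into the orchestrator in place of the dictionary.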

Debugging Tips

  1. Enable Verbose Logging:

    {
      "Logging": {
        "LogLevel": {
          "Default": "Trace"
        }
      }
    }
    
  2. Use Breakpoints:

    • Set breakpoints in ChatOrchestrator.ProcessChatAsync
    • Inspect messages array and tool calls
    • Verify tool results
  3. Test Components Independently:

    Test Azure OpenAI:

    // Add a test endpoint to Program.cs
    app.MapGet("/test/openai", async (AzureOpenAIService openai) =>
    {
        var messages = new List<ChatMessage>
        {
            new UserChatMessage("Say hello")
        };
        var result = await openai.GetChatCompletionAsync(messages);
        return Results.Ok(result.Content[0].Text);
    });
    

    Test MCP Connection:

    // Add a test endpoint to Program.cs
    app.MapGet("/test/mcp", async (McpClientService mcp) =>
    {
        var sessionId = await mcp.CreateSessionAsync();
        return Results.Ok(new { sessionId });
    });
    
  4. Check Application Insights (if configured):

    • View distributed traces
    • Analyze failed requests
    • Monitor dependencies

Getting Help

If you’re still experiencing issues:

  1. Check Logs: Review application logs for detailed error messages
  2. Verify Configuration: Double-check all settings in appsettings.json
  3. Test Network: Ensure connectivity to both Azure OpenAI and MCP server
  4. Review Documentation: Revisit the Azure OpenAI and Model Context Protocol documentation for service-specific details

Deployment to Azure

Step 1: Prepare for Deployment

  1. Update appsettings.json for Production:

Remove sensitive data:

{
  "AzureOpenAI": {
    "Endpoint": "https://your-resource.openai.azure.com",
    "ApiKey": "",  // Will be set via environment variables
    "DeploymentName": "gpt-4",
    "MaxTokens": 1000,
    "Temperature": 0.7
  },
  "McpServer": {
    "BaseUrl": "https://your-mcp-server.azurewebsites.net",
    "SseEndpoint": "/sse",
    "MessageEndpoint": "/message",
    "ConnectionTimeout": 30,
    "ReconnectAttempts": 3
  }
}
  2. Create .gitignore:
bin/
obj/
appsettings.Development.json
*.user

Step 2: Install Azure CLI

If you haven’t already:

# Windows (using winget)
winget install Microsoft.AzureCLI

# macOS (using Homebrew)
brew install azure-cli

# Linux
curl -sL https://aka.ms/InstallAzureCLIDeb | sudo bash

Login to Azure:

az login

Step 3: Create Azure Resources

  1. Create Resource Group:
az group create \
  --name rg-mcp-client \
  --location eastus
  2. Create App Service Plan:
az appservice plan create \
  --name plan-mcp-client \
  --resource-group rg-mcp-client \
  --sku B1 \
  --is-linux
  3. Create Web App:
az webapp create \
  --name your-mcp-client-app \
  --resource-group rg-mcp-client \
  --plan plan-mcp-client \
  --runtime "DOTNETCORE|9.0"

Step 4: Configure Application Settings

Set environment variables for your app:

az webapp config appsettings set \
  --name your-mcp-client-app \
  --resource-group rg-mcp-client \
  --settings \
    AzureOpenAI__Endpoint="https://your-resource.openai.azure.com" \
    AzureOpenAI__ApiKey="your-api-key" \
    AzureOpenAI__DeploymentName="gpt-4" \
    AzureOpenAI__MaxTokens="1000" \
    AzureOpenAI__Temperature="0.7" \
    McpServer__BaseUrl="https://your-mcp-server.azurewebsites.net"

Security Best Practice: For production, use Managed Identity instead of API keys (see Using Managed Identity section). If you must use API keys, configure them as application settings rather than in configuration files.

Step 5: Deploy Your Application

Option 1: Deploy using Azure CLI

# Build and publish
dotnet publish -c Release -o ./publish

# Create deployment package
cd publish
zip -r ../deploy.zip .
cd ..

# Deploy to Azure
az webapp deployment source config-zip \
  --name your-mcp-client-app \
  --resource-group rg-mcp-client \
  --src deploy.zip

Option 2: Deploy using Visual Studio

  1. Right-click project → Publish
  2. Choose Azure → Azure App Service (Linux)
  3. Select your subscription and app service
  4. Click Publish

Option 3: Deploy using GitHub Actions

Create .github/workflows/deploy.yml:

name: Deploy to Azure

on:
  push:
    branches: [ main ]

jobs:
  build-and-deploy:
    runs-on: ubuntu-latest

    steps:
    - uses: actions/checkout@v4

    - name: Setup .NET
      uses: actions/setup-dotnet@v4
      with:
        dotnet-version: '9.0.x'

    - name: Build
      run: dotnet build --configuration Release

    - name: Publish
      run: dotnet publish -c Release -o ./publish

    - name: Deploy to Azure
      uses: azure/webapps-deploy@v3
      with:
        app-name: 'your-mcp-client-app'
        publish-profile: ${{ secrets.AZURE_WEBAPP_PUBLISH_PROFILE }}
        package: ./publish

Step 6: Verify Deployment

  1. Test Health Endpoint:
curl https://your-mcp-client-app.azurewebsites.net/api/health
  2. Test Chat Endpoint:
curl -X POST https://your-mcp-client-app.azurewebsites.net/api/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "What time is it in New York?"}'
Step 7: Configure Application Insights (Optional)

  1. Create Application Insights:
az monitor app-insights component create \
  --app insights-mcp-client \
  --location eastus \
  --resource-group rg-mcp-client \
  --application-type web
  2. Link to Web App:
az webapp config appsettings set \
  --name your-mcp-client-app \
  --resource-group rg-mcp-client \
  --settings \
    APPLICATIONINSIGHTS_CONNECTION_STRING="[connection-string-from-above]"
  3. Add Application Insights SDK:
dotnet add package Microsoft.ApplicationInsights.AspNetCore

Update Program.cs:

// Add this line before building the app
builder.Services.AddApplicationInsightsTelemetry();

Step 8: Set Up Continuous Deployment

  1. Configure deployment from Git:
az webapp deployment source config \
  --name your-mcp-client-app \
  --resource-group rg-mcp-client \
  --repo-url https://github.com/your-username/your-repo \
  --branch main \
  --manual-integration

Deployment Checklist

  • Azure resources created (Resource Group, App Service Plan, Web App)
  • Application settings configured (Azure OpenAI, MCP server)
  • Sensitive data moved to Key Vault or User Secrets
  • Application deployed successfully
  • Health endpoint returns 200 OK
  • Chat endpoint responds correctly
  • Application Insights configured (optional)
  • Continuous deployment configured (optional)

Next Steps

Congratulations! You’ve successfully built and deployed an ASP.NET Core application that integrates Azure OpenAI with an MCP server.

Enhancements to Consider

  1. Persistent Conversation Storage:

    • Replace in-memory dictionary with Redis or Cosmos DB
    • Implement conversation history cleanup policies
    • Add conversation retrieval endpoints
  2. Advanced Error Handling:

    • Implement retry policies with Polly
    • Add circuit breaker for external services
    • Create custom error responses
  3. Security Improvements:

    • Add authentication (JWT, OAuth 2.0)
    • Implement rate limiting
    • Add input validation and sanitization
    • Use Azure Managed Identity instead of API keys
  4. Performance Optimization:

    • Implement response caching
    • Add connection pooling configuration
    • Optimize conversation history management
    • Implement streaming responses
  5. Monitoring and Observability:

    • Add custom telemetry events
    • Create dashboards in Application Insights
    • Set up alerts for errors and performance
    • Implement distributed tracing
  6. Additional MCP Tools:

    • Add more MCP tools to your server
    • Implement dynamic tool discovery
    • Create tool categories and permissions
  7. Frontend Development:

    • Build a web UI for chat interface
    • Create mobile app integration
    • Add real-time updates with SignalR
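As a starting point for the retry and circuit-breaker items above, a sketch using the `Microsoft.Extensions.Http.Polly` package — the policy values are illustrative, not tuned recommendations:

```csharp
using Polly;
using Polly.Extensions.Http;

// Program.cs — wrap the MCP HttpClient in retry + circuit-breaker policies.
builder.Services.AddHttpClient<McpClientService>()
    // Retry transient failures (5xx, 408, HttpRequestException) with
    // exponential backoff: 2s, 4s, 8s.
    .AddPolicyHandler(HttpPolicyExtensions.HandleTransientHttpError()
        .WaitAndRetryAsync(3, attempt => TimeSpan.FromSeconds(Math.Pow(2, attempt))))
    // Open the circuit after 5 consecutive failures; stay open for 30 seconds.
    .AddPolicyHandler(HttpPolicyExtensions.HandleTransientHttpError()
        .CircuitBreakerAsync(5, TimeSpan.FromSeconds(30)));
```

Add the package with `dotnet add package Microsoft.Extensions.Http.Polly`; the same pattern can wrap any other outbound `HttpClient` the application registers.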

Learning Resources