Documentation Index
Fetch the complete documentation index at: https://mintlify.com/modelcontextprotocol/csharp-sdk/llms.txt
Use this file to discover all available pages before exploring further.
The MCP C# SDK integrates seamlessly with Microsoft.Extensions.AI, allowing MCP tools to work directly with any IChatClient implementation.
Quick Start
Connect to an MCP server and use its tools with an LLM:
```csharp
using Microsoft.Extensions.AI;
using ModelContextProtocol.Client;
using OpenAI;

// Connect to MCP server
var transport = new StdioClientTransport(new()
{
    Command = "npx",
    Arguments = ["-y", "@modelcontextprotocol/server-everything"]
});

await using var mcpClient = await McpClient.CreateAsync(transport);

// Get tools from server
var tools = await mcpClient.ListToolsAsync();

// Create LLM client
using IChatClient chatClient = new OpenAIClient(apiKey)
    .AsChatClient("gpt-4o")
    .AsBuilder()
    .UseFunctionInvocation()
    .Build();

// Chat with tools
List<ChatMessage> messages = [
    new(ChatRole.User, "What's the weather in Seattle?")
];

var response = await chatClient.GetResponseAsync(
    messages,
    new ChatOptions { Tools = [.. tools] }
);

Console.WriteLine(response.Message);
```
MCP tools implement AIFunction from Microsoft.Extensions.AI, making them compatible with any IChatClient:
```csharp
await using var client = await McpClient.CreateAsync(transport);

// List all available tools
var tools = await client.ListToolsAsync();

foreach (var tool in tools)
{
    Console.WriteLine($"Tool: {tool.Name}");
    Console.WriteLine($"Description: {tool.Description}");
}

// Tools can be passed directly to ChatOptions
var options = new ChatOptions
{
    Tools = [.. tools]
};
```
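Because each MCP tool is an AIFunction, it can also be invoked directly, outside of any chat loop. A minimal sketch, assuming the Microsoft.Extensions.AI `AIFunctionArguments` type; the "echo" tool name and its "message" argument are illustrative, not part of every server:

```csharp
// Sketch: invoke an MCP tool directly as an AIFunction.
// "echo" and "message" are illustrative; use a tool your server exposes.
AIFunction echo = tools.First(t => t.Name == "echo");

var result = await echo.InvokeAsync(new AIFunctionArguments
{
    ["message"] = "Hello from Microsoft.Extensions.AI"
});

Console.WriteLine(result);
```

This is the same code path the function-invocation middleware uses when the model requests a tool call, which can make it handy for testing a tool in isolation.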
Streaming Conversations
Use streaming for real-time responses:
```csharp
using System.Text;
using Anthropic;
using Microsoft.Extensions.AI;
using ModelContextProtocol.Client;

// Connect to server
var transport = new StdioClientTransport(new()
{
    Command = "dotnet",
    Arguments = ["run", "--project", "../WeatherServer"]
});

await using var mcpClient = await McpClient.CreateAsync(transport);
var tools = await mcpClient.ListToolsAsync();

// Create Anthropic client with function invocation
using var chatClient = new AnthropicClient(apiKey)
    .AsChatClient("claude-4.5-sonnet-20250514")
    .AsBuilder()
    .UseFunctionInvocation()
    .Build();

var messages = new List<ChatMessage>();
var sb = new StringBuilder();

while (true)
{
    Console.Write("You: ");
    var input = Console.ReadLine();
    if (string.IsNullOrWhiteSpace(input)) break;

    messages.Add(new ChatMessage(ChatRole.User, input));

    // Stream response
    await foreach (var update in chatClient.GetStreamingResponseAsync(
        messages,
        new ChatOptions { Tools = [.. tools] }
    ))
    {
        Console.Write(update);
        sb.Append(update.ToString());
    }
    Console.WriteLine();

    messages.Add(new ChatMessage(ChatRole.Assistant, sb.ToString()));
    sb.Clear();
}
```
Multi-Provider Support
The same MCP tools work with any Microsoft.Extensions.AI provider:
```csharp
using OpenAI;

var chatClient = new OpenAIClient(apiKey)
    .AsChatClient("gpt-4o")
    .AsBuilder()
    .UseFunctionInvocation()
    .Build();

var response = await chatClient.GetResponseAsync(
    messages,
    new ChatOptions { Tools = [.. tools] }
);
```
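Swapping providers means changing only the client construction; the tool plumbing is untouched. As one example, a sketch assuming the Microsoft.Extensions.AI.Ollama package and a local Ollama instance on its default port (the model name is illustrative):

```csharp
using Microsoft.Extensions.AI;

// Sketch: the same MCP tools with a local Ollama model.
// Assumes Microsoft.Extensions.AI.Ollama and a server at the default port.
using var chatClient = new OllamaChatClient(
        new Uri("http://localhost:11434"), "llama3.1")
    .AsBuilder()
    .UseFunctionInvocation()
    .Build();

var response = await chatClient.GetResponseAsync(
    messages,
    new ChatOptions { Tools = [.. tools] }
);
```

Because MCP tools surface as AIFunction instances, any provider that supports function calling through IChatClient can use them unchanged.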
The UseFunctionInvocation() middleware automatically handles tool calls. You can also invoke tools manually:
```csharp
await using var client = await McpClient.CreateAsync(transport);

// Call a tool directly
var result = await client.CallToolAsync(
    "get_weather",
    new Dictionary<string, object?>
    {
        ["location"] = "Seattle, WA",
        ["unit"] = "fahrenheit"
    }
);

// Extract text content
var textContent = result.Content
    .OfType<TextContentBlock>()
    .FirstOrDefault()?.Text;

Console.WriteLine(textContent);
```
Progress Reporting
Monitor tool execution progress:
```csharp
var progress = new Progress<ProgressNotificationValue>(p =>
{
    Console.WriteLine($"Progress: {p.Progress}/{p.Total}");
});

var result = await client.CallToolAsync(
    "long_running_task",
    arguments: null,
    progress: progress
);
```
React to tool list changes:
```csharp
await using var disposable = client.RegisterNotificationHandler(
    NotificationMethods.ToolListChangedNotification,
    async (notification, cancellationToken) =>
    {
        Console.WriteLine("Tool list changed, refreshing...");
        var tools = await client.ListToolsAsync();
        // Update your ChatOptions with new tools
    }
);
```
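One way to put such a handler to work is to keep a shared ChatOptions instance in sync with the server, so every subsequent chat call picks up the refreshed tool list. A sketch, assuming `options` is the instance you pass to each call:

```csharp
// Sketch: keep a shared ChatOptions in sync with the server's tool list.
var options = new ChatOptions { Tools = [.. await client.ListToolsAsync()] };

await using var disposable = client.RegisterNotificationHandler(
    NotificationMethods.ToolListChangedNotification,
    async (notification, cancellationToken) =>
    {
        var refreshed = await client.ListToolsAsync(cancellationToken: cancellationToken);
        options.Tools = [.. refreshed];
    }
);
```

If multiple threads read the options concurrently, consider building a fresh ChatOptions and swapping the reference instead of mutating the shared one in place.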
Complete Example
Here’s a complete interactive chat application:
```csharp
using Microsoft.Extensions.AI;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.Hosting;
using ModelContextProtocol.Client;
using OpenAI;

var builder = Host.CreateApplicationBuilder(args);
builder.Configuration
    .AddEnvironmentVariables()
    .AddUserSecrets<Program>();

// Connect to MCP server
var transport = new StdioClientTransport(new()
{
    Command = "npx",
    Arguments = ["-y", "@modelcontextprotocol/server-everything"],
    Name = "Everything Server"
});

await using var mcpClient = await McpClient.CreateAsync(transport);

// Get available tools
var tools = await mcpClient.ListToolsAsync();
Console.WriteLine($"Connected with {tools.Count} tools available");

// Create LLM client
using var chatClient = new OpenAIClient(
    builder.Configuration["OPENAI_API_KEY"]
)
    .AsChatClient("gpt-4o")
    .AsBuilder()
    .UseFunctionInvocation()
    .Build();

var messages = new List<ChatMessage>();
var options = new ChatOptions
{
    Tools = [.. tools]
};

Console.WriteLine("Chat started! Type 'exit' to quit.\n");

while (true)
{
    Console.Write("You: ");
    var input = Console.ReadLine();
    if (string.IsNullOrWhiteSpace(input) ||
        input.Equals("exit", StringComparison.OrdinalIgnoreCase))
    {
        break;
    }

    messages.Add(new ChatMessage(ChatRole.User, input));

    var response = await chatClient.GetResponseAsync(messages, options);
    Console.WriteLine($"Assistant: {response.Message}");
    messages.Add(response.Message);
}
```
Best Practices
Always enable function invocation: Use .UseFunctionInvocation() in your IChatClient builder to automatically handle tool calls.
Dispose clients properly: Always use await using with McpClient to ensure proper cleanup of server processes and connections.
Tool updates: Register handlers for ToolListChangedNotification to stay synchronized when servers add or remove tools dynamically.
Next Steps
Sampling: Handle server sampling requests
Roots: Provide filesystem roots to servers