Our journey continues, as we keep finding new features to explore in the Microsoft.Extensions.AI library. Today we have a look at the support for MCP.

This post is part of a blog series. Other posts so far:

Part I – An introduction to Microsoft.Extensions.AI
Part II – ASP.NET Core integration
Part III – Tool calling
Part IV – Telemetry integration
Part V – Chat history
Part VI – Structured output
Part VII – MCP integration (this post)

What is MCP?

I know, I know, you must have been hiding under a rock if you have never heard about MCP before. But just in case: MCP (Model Context Protocol) is an open protocol developed by Anthropic that provides a standardized way to connect AI models to different data sources and tools. This allows us to use tool calling without having to build our own plugins (as I demonstrated in Part III of this blog series).

Using MCP with Microsoft.Extensions.AI

The first thing you need is an MCP server. Today there…
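To give you an idea of where this is heading, here is a minimal sketch of the overall flow: connect to an MCP server, list its tools, and hand those tools to an IChatClient. It assumes the preview ModelContextProtocol NuGet package (client side) next to Microsoft.Extensions.AI; because the package is still in preview, type and method names such as StdioClientTransport, McpClientFactory.CreateAsync and ListToolsAsync may shift between versions, and the "everything" reference server started via npx is just an example.

```csharp
using Microsoft.Extensions.AI;
using ModelContextProtocol.Client;

public static class McpSketch
{
    public static async Task RunAsync(IChatClient innerClient)
    {
        // Connect to an MCP server over stdio (here the reference "everything"
        // server, started through npx purely as an example).
        var transport = new StdioClientTransport(new StdioClientTransportOptions
        {
            Name = "Everything",
            Command = "npx",
            Arguments = ["-y", "@modelcontextprotocol/server-everything"],
        });

        var mcpClient = await McpClientFactory.CreateAsync(transport);

        // McpClientTool derives from AIFunction, so the server's tools can be
        // passed to the chat client like any other tool.
        IList<McpClientTool> tools = await mcpClient.ListToolsAsync();

        // Wrap the inner client so tool calls are invoked automatically,
        // as we did in the tool-calling post earlier in this series.
        IChatClient chatClient = innerClient
            .AsBuilder()
            .UseFunctionInvocation()
            .Build();

        var response = await chatClient.GetResponseAsync(
            "Summarize the resources exposed by the server.",
            new ChatOptions { Tools = [.. tools] });

        Console.WriteLine(response.Text);
    }
}
```

The nice part is that nothing changes on the Microsoft.Extensions.AI side: the MCP tools behave just like the hand-written AIFunction plugins from Part III.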
Still not at the end of our journey, as we keep finding new features to explore in the Microsoft.Extensions.AI library. Today we have a look at the support for Structured Output.

This post is part of a blog series. Other posts so far:

Part I – An introduction to Microsoft.Extensions.AI
Part II – ASP.NET Core integration
Part III – Tool calling
Part IV – Telemetry integration
Part V – Chat history
Part VI – Structured output (this post)

What is structured output?

By default, the LLM replies in free-form text. This is great during chat conversations, but not so great if you want to use the LLM response in a programmatic context. By using structured output, you can specify a JSON schema that describes the exact output the LLM should return.

Using structured output with Microsoft.Extensions.AI

To use structured output with Microsoft.Extensions.AI, you have specific methods available in the ChatClientStructuredOutputExtensions class. By passing a generic…
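As a teaser, here is a minimal sketch of what that looks like. It assumes a recent Microsoft.Extensions.AI version where the structured-output extensions are exposed as a generic GetResponseAsync<T> overload returning ChatResponse<T> (earlier previews used different method names), and the MovieReview record is just an illustrative type I made up for this example.

```csharp
using Microsoft.Extensions.AI;

// An example shape we want the LLM to fill in; a JSON schema is derived from it.
public record MovieReview(string Title, int Rating, string Summary);

public static class StructuredOutputSketch
{
    public static async Task<MovieReview> GetReviewAsync(IChatClient chatClient)
    {
        // The generic overload from ChatClientStructuredOutputExtensions asks the
        // model to answer according to the schema of MovieReview.
        ChatResponse<MovieReview> response =
            await chatClient.GetResponseAsync<MovieReview>(
                "Review the movie 'Inception' and rate it from 1 to 10.");

        // The reply is deserialized into a strongly typed object.
        return response.Result;
    }
}
```

Instead of parsing free-form text yourself, you get a strongly typed object back that you can use directly in the rest of your application.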