We are still not at the end of our journey, as we keep finding new features to explore in the Microsoft.Extensions.AI library. Today we take a look at the support for structured output.
This post is part of a blog series. Other posts so far:
- Part I – An introduction to Microsoft.Extensions.AI
- Part II – ASP.NET Core integration
- Part III – Tool calling
- Part IV – Telemetry integration
- Part V – Chat history
- Part VI – Structured output (this post)
What is structured output?
By default, the LLM replies in free-form text. This is great during chat conversations, but not so great if you want to use the LLM response in a programmatic context. With structured output, you can specify a JSON schema that describes the exact shape of the output the LLM should return.
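For example, instead of a free-text paragraph, you could constrain the model to a (hypothetical) schema like this, so the response can be deserialized reliably:

```json
{
  "type": "object",
  "properties": {
    "title": { "type": "string" },
    "rating": { "type": "integer" }
  },
  "required": ["title", "rating"]
}
```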
Using structured output with Microsoft.Extensions.AI
To use structured output with Microsoft.Extensions.AI, you can use the extension methods available in the ChatClientStructuredOutputExtensions class.
When you pass a generic type argument, a JSON schema is inferred from the provided class.
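As a minimal sketch, assuming an IChatClient instance named chatClient (obtained elsewhere, e.g. via dependency injection) and a hypothetical MovieReview class of our own, the generic overload of GetResponseAsync looks like this:

```csharp
using Microsoft.Extensions.AI;

// Hypothetical type describing the output we want; a JSON schema
// is inferred from its public properties.
public class MovieReview
{
    public string Title { get; set; } = "";
    public int Rating { get; set; }
    public string Summary { get; set; } = "";
}

// 'chatClient' is an IChatClient configured elsewhere.
ChatResponse<MovieReview> response = await chatClient.GetResponseAsync<MovieReview>(
    "Review the movie 'Inception' and rate it from 1 to 5.");

// The response is deserialized into a strongly typed result.
MovieReview review = response.Result;
Console.WriteLine($"{review.Title}: {review.Rating}/5");
```

Because the result is a typed object rather than raw text, it can be passed straight into the rest of your application logic.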
Another option is to set the ResponseFormat property in the ChatOptions:
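A sketch of that approach, again assuming a configured IChatClient named chatClient; the ChatResponseFormat.Json value asks the model for raw JSON, while ChatResponseFormat.ForJsonSchema lets you supply an explicit schema:

```csharp
using System.Text.Json;
using Microsoft.Extensions.AI;

var options = new ChatOptions
{
    // Ask the model to return JSON (no specific schema enforced).
    ResponseFormat = ChatResponseFormat.Json
};

var response = await chatClient.GetResponseAsync(
    "List three movie genres as a JSON array of strings.", options);

Console.WriteLine(response.Text);

// Alternatively, supply an explicit JSON schema:
JsonElement schema = JsonSerializer.Deserialize<JsonElement>("""
    { "type": "object",
      "properties": { "title": { "type": "string" } },
      "required": ["title"] }
    """);
options.ResponseFormat = ChatResponseFormat.ForJsonSchema(schema);
```

Note that with this option you get the JSON back as text and are responsible for deserializing it yourself, whereas the generic GetResponseAsync<T> overload handles that for you.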
More information
ChatClientStructuredOutputExtensions Class (Microsoft.Extensions.AI) | Microsoft Learn
Quickstart - Request a response with structured output - .NET | Microsoft Learn