
Structured output with Ollama

I talked about Structured Outputs before in the context of Semantic Kernel. Structured Outputs is a feature that ensures the model will always generate responses that follow a supplied JSON Schema, so you can process the responses in an automated fashion without worrying about getting invalid JSON back.

Recently, Ollama announced support for Structured Outputs. In this post I want to show you how you can use this in combination with OllamaSharp, the C# client for Ollama.

Using Structured Output in OllamaSharp

Remark: Make sure you have the latest Ollama version running on your local machine before you continue.

  • Add the OllamaSharp client to your project:

dotnet add package OllamaSharp

  • Now let’s define our response model:

// Define response models
public class Recipe
{
    public List<Ingredient> Ingredients { get; set; }
}

public class Ingredient
{
    public string Name { get; set; }
    public string Quantity { get; set; }
    public string Unit { get; set; }
}
  • Afterwards we need to instantiate a new OllamaSharp client:
var uri = new Uri("http://localhost:11434");
var ollama = new OllamaApiClient(uri);
ollama.SelectedModel = "llama3.1:latest";
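
Remark: The selected model must already be available locally. If it isn’t, pull it first with ollama pull llama3.1.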
  • We create a new request object.
    • Notice that we specify a JSON schema object based on the Recipe model we created earlier.
var request = new GenerateRequest()
{
    Prompt = "What are the ingredients needed to prepare a Christmas Turkey?",
    Format = JsonSchema.ToJsonSchema(typeof(Recipe))
};
  • If we now invoke the application, we get a JSON response back in the provided format, as sketched below.
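
Here is a minimal sketch of what that invocation could look like, assuming OllamaSharp’s streaming GenerateAsync API (exact call patterns may differ between versions):

using System.Text;
using System.Text.Json;

// Stream the completion and collect the JSON payload.
var responseText = new StringBuilder();
await foreach (var chunk in ollama.GenerateAsync(request))
    responseText.Append(chunk?.Response);

// Because the response follows the supplied schema, it can be
// deserialized straight into the Recipe model.
var recipe = JsonSerializer.Deserialize<Recipe>(responseText.ToString());
foreach (var ingredient in recipe!.Ingredients)
    Console.WriteLine($"{ingredient.Quantity} {ingredient.Unit} {ingredient.Name}");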

The full example can be found in the OllamaClient branch of wullemsb/SemanticKernelStructuredOutput on GitHub.

What about using the Semantic Kernel Ollama connector?

At the time of writing this blog post, I couldn’t get Structured Outputs working when combining Semantic Kernel and Ollama.

I explicitly referenced the correct OllamaSharp version:

<Project Sdk="Microsoft.NET.Sdk">
  <ItemGroup>
    <PackageReference Include="Microsoft.Extensions.Configuration.UserSecrets" Version="6.0.1" />
    <PackageReference Include="Microsoft.SemanticKernel" Version="1.30.0" />
    <PackageReference Include="Microsoft.SemanticKernel.Connectors.Ollama" Version="1.32.0-alpha" />
    <PackageReference Include="OllamaSharp" Version="4.0.11" />
  </ItemGroup>
</Project>

And used the Ollama connector:

var httpClient = new HttpClient();
httpClient.BaseAddress = new Uri("http://localhost:11434");
httpClient.Timeout = TimeSpan.FromSeconds(120);

Kernel kernel = Kernel.CreateBuilder()
    .AddOllamaChatCompletion("llama3.1", httpClient: httpClient)
    .Build();

// Initialize a ChatResponseFormat object with the JSON schema of the desired response format.
ChatResponseFormat chatResponseFormat = ChatResponseFormat.CreateJsonSchemaFormat(
    jsonSchemaFormatName: "recipe",
    jsonSchema: BinaryData.FromString("""
    {
        "type": "object",
        "properties": {
            "Ingredients": {
                "type": "array",
                "items": {
                    "type": "object",
                    "properties": {
                        "Name": { "type": "string" },
                        "Quantity": { "type": "string" },
                        "Unit": { "type": "string" }
                    },
                    "required": ["Name", "Quantity", "Unit"],
                    "additionalProperties": false
                }
            }
        },
        "required": ["Ingredients"],
        "additionalProperties": false
    }
    """),
    jsonSchemaIsStrict: true);

// Specify the response format by setting the ChatResponseFormat object in the prompt execution settings.
var executionSettings = new OpenAIPromptExecutionSettings
{
    ResponseFormat = chatResponseFormat
};

// Send a request and pass the prompt execution settings with the desired response format.
var result = await kernel.InvokePromptAsync(
    "What are the ingredients needed to prepare a Christmas Turkey?",
    new(executionSettings));

But when calling the API, I still got a response back in an unstructured format.
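
A plausible explanation is that the Ollama connector maps the prompt execution settings onto its own Ollama-specific settings and ignores the OpenAI-specific ResponseFormat property, so the schema never reaches Ollama. As a stop-gap, you can drop down to OllamaSharp for the structured call itself, since Ollama’s chat endpoint accepts the same format field. Here is a minimal sketch, reusing the Recipe model and the client from the first part of this post (names may differ between OllamaSharp versions):

using OllamaSharp.Models.Chat;

var chatRequest = new ChatRequest
{
    Model = "llama3.1:latest",
    Messages = new List<Message>
    {
        new Message(ChatRole.User, "What are the ingredients needed to prepare a Christmas Turkey?")
    },
    Format = JsonSchema.ToJsonSchema(typeof(Recipe))
};

// Stream the chat response; with the schema applied, the content is valid JSON.
await foreach (var chunk in ollama.ChatAsync(chatRequest))
    Console.Write(chunk?.Message?.Content);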

More information

Semantic Kernel - Structured output

awaescher/OllamaSharp: The easiest way to use the Ollama API in .NET

Support for 'Structured outputs' · awaescher/OllamaSharp · Discussion #152

Structured outputs · Ollama Blog
