
Semantic Kernel–Agent Framework

In this post I'll show you the recently introduced Semantic Kernel Agent Framework and how it simplifies building your own AI agents. But maybe I should start with a short recap of Semantic Kernel.

On the documentation pages, Semantic Kernel is described like this:

Semantic Kernel is a lightweight, open-source development kit that lets you easily build AI agents and integrate the latest AI models into your C#, Python, or Java codebase. It serves as an efficient middleware that enables rapid delivery of enterprise-grade solutions.

It gives you all the building blocks required to build your own agent: a chat completion model, a plugin system, a planner and more. However, until recently you had to bring all these building blocks together yourself.

Here is a small code snippet I copied from an existing project:

// Import packages
using Microsoft.Extensions.Logging;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;

// Create a kernel with Azure OpenAI chat completion
var builder = Kernel.CreateBuilder().AddAzureOpenAIChatCompletion(modelId, endpoint, apiKey);

// Add enterprise components
builder.Services.AddLogging(services => services.AddConsole().SetMinimumLevel(LogLevel.Trace));

// Build the kernel
Kernel kernel = builder.Build();
var chatCompletionService = kernel.GetRequiredService<IChatCompletionService>();

// Initialize the plugin and add it to the kernel
kernel.Plugins.AddFromType<MenuPlugin>("Menu");

// Enable automatic invocation of plugins
OpenAIPromptExecutionSettings openAIPromptExecutionSettings = new()
{
    ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions
};

// Create a chat history to store the conversation
var history = new ChatHistory("You are a waiter answering questions about the menu.");

// Initiate a back-and-forth chat
string? userInput;
do
{
    // Collect user input
    Console.Write("User > ");
    userInput = Console.ReadLine();

    // Add the user input to the history
    history.AddUserMessage(userInput);

    // Get the response from the AI
    var result = await chatCompletionService.GetChatMessageContentAsync(
        history,
        executionSettings: openAIPromptExecutionSettings,
        kernel: kernel);

    // Print the result
    Console.WriteLine("Assistant > " + result);

    // Add the message from the agent to the chat history
    history.AddAssistantMessage(result.Content ?? string.Empty);
} while (userInput is not null);
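
The snippet above (and the agent versions later in this post) assumes a MenuPlugin class that exposes the menu to the model as kernel functions. The plugin itself is plain Semantic Kernel and not part of the new agent abstraction; a minimal sketch could look like the code below, where the function names and menu contents are made up for illustration.

// Hypothetical menu plugin: the functions and values below are illustrative only.
using System.ComponentModel;
using Microsoft.SemanticKernel;

public sealed class MenuPlugin
{
    [KernelFunction, Description("Provides a list of specials from the menu.")]
    public string GetSpecials() =>
        """
        Special Soup: Clam Chowder
        Special Salad: Cobb Salad
        Special Drink: Chai Tea
        """;

    [KernelFunction, Description("Provides the price of the requested menu item.")]
    public string GetItemPrice([Description("The name of the menu item.")] string menuItem) =>
        "$9.99";
}

Because automatic tool calling is enabled, the model decides on its own when to call these functions to answer a menu question.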

There is a lot going on in the code above, and if you are having a hard time understanding all of it, I have some good news for you. Starting with the Python (1.6.0) and .NET (1.18.0 RC1) releases, Semantic Kernel now provides a first-class abstraction for agents.

To use it, we first need to add the following NuGet package:

dotnet add package Microsoft.SemanticKernel.Agents.Core
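
Remark: at the time of writing the agent packages are still in preview, so depending on your NuGet configuration you may have to explicitly allow prerelease versions:

dotnet add package Microsoft.SemanticKernel.Agents.Core --prerelease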

Let’s rewrite the code above to use the new agent abstraction:

// Import packages
using Microsoft.Extensions.Logging;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Agents;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;

// Create a kernel with Azure OpenAI chat completion
var builder = Kernel.CreateBuilder().AddAzureOpenAIChatCompletion(modelId, endpoint, apiKey);

// Add enterprise components
builder.Services.AddLogging(services => services.AddConsole().SetMinimumLevel(LogLevel.Trace));

// Build the kernel
Kernel kernel = builder.Build();

// Define the agent
ChatCompletionAgent agent =
    new()
    {
        Instructions = "Answer questions about the menu.",
        Name = "Waiter",
        Kernel = kernel,
        Arguments = new KernelArguments(new OpenAIPromptExecutionSettings() { ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions }),
    };

// Initialize the plugin and add it to the agent's Kernel (same as direct Kernel usage)
KernelPlugin plugin = KernelPluginFactory.CreateFromType<MenuPlugin>();
agent.Kernel.Plugins.Add(plugin);

// Create the chat history to capture the agent interaction
ChatHistory chat = [];

// Initiate a back-and-forth chat
string? userInput;
do
{
    // Collect user input
    Console.Write("User > ");
    userInput = Console.ReadLine();

    // Add the user input to the history
    chat.AddUserMessage(userInput);

    // Get the response from the AI
#pragma warning disable SKEXP0110 // Type is for evaluation purposes only and is subject to change or removal in future updates.
    await foreach (ChatMessageContent response in agent.InvokeAsync(chat))
    {
        // Print the result
        Console.WriteLine("Assistant > " + response.ToString());

        // Add the message from the agent to the chat history
        chat.Add(response);
    }
#pragma warning restore SKEXP0110 // Type is for evaluation purposes only and is subject to change or removal in future updates.
} while (userInput is not null);

This is already an improvement but you still have to manage the chat history yourself.

If you are using an OpenAI-based model, you can go one step further and use the OpenAI Assistant abstraction so that the conversation state is managed for you on the OpenAI side.
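
The OpenAIAssistantAgent type ships in a separate NuGet package (at least that is how the preview is packaged at the time of writing), so you first have to add it as well:

dotnet add package Microsoft.SemanticKernel.Agents.OpenAI --prerelease

With that in place, the agent can be defined like this: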

// Define the agent
OpenAIAssistantAgent agent =
    await OpenAIAssistantAgent.CreateAsync(
        kernel: new(),
        config: new("<ApiKey>", "<Endpoint URL>"),
        new()
        {
            Name = "Waiter",
            Instructions = "Answer questions about the menu.",
            ModelId = "<ModelId>",
            EnableCodeInterpreter = false,
        });

// Initialize plugin and add to the agent's Kernel (same as direct Kernel usage)
KernelPlugin plugin = KernelPluginFactory.CreateFromType<MenuPlugin>();
agent.Kernel.Plugins.Add(plugin);

// Initiate a back-and-forth chat
string threadId = await agent.CreateThreadAsync();
string? userInput;
do
{
    // Collect user input
    Console.Write("User > ");
    userInput = Console.ReadLine();
    await agent.AddChatMessageAsync(threadId, new ChatMessageContent(AuthorRole.User, userInput));

    // Get the response from the AI
    await foreach (ChatMessageContent message in agent.InvokeAsync(threadId))
    {
        // Print the results
        Console.WriteLine("Assistant > " + message.ToString());
    }
} while (userInput is not null);
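
Because the assistant and its threads live on the OpenAI side, it is a good idea to clean them up when the conversation is over. Based on my reading of the preview samples (double-check the GettingStartedWithAgents samples linked below, as this API may still change), that looks roughly like this:

// Clean up the server-side resources once we are done (preview API, may change)
await agent.DeleteThreadAsync(threadId);
await agent.DeleteAsync();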

Nice!

Remark: Everything I'm showing here is still in preview and will probably change in the future.

More information

semantic-kernel/dotnet/samples/GettingStartedWithAgents at main · microsoft/semantic-kernel (github.com)

Introducing enterprise multi-agent support in Semantic Kernel | Semantic Kernel (microsoft.com)
