
Tool support in OllamaSharp

Yesterday I talked about OllamaSharp as an alternative to Semantic Kernel for talking to your Ollama endpoint using C#. The reason I wanted to use Ollama directly instead of Semantic Kernel was that I wanted to give the recently announced Ollama tool support a try. And that is exactly what we are going to explore in this post.



Tool support

Tool support allows a model to answer a given prompt using tools it knows about, making it possible to interact with the outside world and, for example, call an API. It makes your model a lot smarter, as it can use information that was not part of its original training and do more than just return a response.

Remark: A similar feature exists in Semantic Kernel through the concept of Plugins, but as far as I’m aware these Plugins don’t use the Ollama tool support (yet). While writing this post I noticed that a new Ollama connector was released for Semantic Kernel which uses OllamaSharp behind the scenes. I’ll have a look at whether this new connector can call the Ollama tool support and update this post with my findings. Promised!

To show the power of tool calling, we first ask for some weather information using the example we created yesterday:


The model nicely explains that it cannot access weather information.

OK. Let’s start by creating a tool that returns the weather for a location. It is important that we give a good description of the tool and of the parameters it expects:

var tool = new Tool()
{
    Function = new Function
    {
        Description = "Get the current weather for a location",
        Name = "GetWeather",
        Parameters = new Parameters
        {
            Properties = new Dictionary<string, Properties>
            {
                ["location"] = new()
                {
                    Type = "string",
                    Description = "The location to get the weather for, e.g. San Francisco, CA"
                },
                ["format"] = new()
                {
                    Type = "string",
                    Description = "The format to return the weather in, e.g. 'celsius' or 'fahrenheit'",
                    Enum = ["celsius", "fahrenheit"]
                },
            },
            Required = ["location", "format"],
        }
    },
    Type = "function"
};

Now we can update our chat example from yesterday and pass in the tool instance:

using OllamaSharp; // OllamaApiClient and Chat
using OllamaSharp.Models.Chat; // Tool, Function, Parameters and Properties used in the tool definition above

var uri = new Uri("http://localhost:11434"); // Default Ollama endpoint
var ollama = new OllamaApiClient(uri);
ollama.SelectedModel = "llama3.1:latest";

string GetWeather(string location, string format)
{
    // Call the weather API here
    return $"The weather in {location} is 25 degrees {format}";
}

var chat = new Chat(ollama);
while (true)
{
    Console.Write("User>");
    var message = Console.ReadLine();
    Console.Write("Assistant>");
    await foreach (var answerToken in chat.Send(message, tools: [tool])) // We pass our tool when calling the LLM
        Console.Write(answerToken);

    // Check the latest message to see if a tool was called
    foreach (var toolCall in chat.Messages.Last().ToolCalls)
    {
        var arguments = string.Join(",", toolCall.Function.Arguments.Select(kvp => $"{kvp.Key}: {kvp.Value}"));
        Console.WriteLine($"Tool called: {toolCall.Function.Name} with following arguments: {arguments}");
    }
    Console.WriteLine();
}

Remark: Notice that we also changed the model we use. Right now only a limited list of models supports tool calling.

If we now ask for the weather, we’ll get the following response back:

What wasn’t immediately clear to me is that the Ollama tool support will not invoke the tool for you. You need to call it yourself, based on the ToolCall information you get back from the LLM.

Here is an updated version where I call the GetWeather method I have available in my code:

// Check the latest message to see if a tool was called
foreach (var toolCall in chat.Messages.Last().ToolCalls)
{
    if (toolCall.Function.Name == nameof(GetWeather))
        Console.WriteLine(GetWeather(toolCall.Function.Arguments["location"], toolCall.Function.Arguments["format"]));
}

Now we get the following results back:

Nice! 

Remark: Be aware that this code is far from production ready but at least you get the idea.
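
One possible next step, as a minimal sketch that is not part of the original demo: instead of only printing the tool result, you can feed it back to the model so it can turn the raw data into a proper answer. This assumes the OllamaSharp Chat class exposes a SendAs(ChatRole, string) overload and a ChatRole.Tool role in the version you are using; if it doesn’t, check the OllamaSharp repository for the equivalent API.

// Sketch only: invoke the tool ourselves and hand the result back to the model.
// Assumption: chat.SendAs(ChatRole.Tool, ...) and ChatRole.Tool exist in your OllamaSharp version.
foreach (var toolCall in chat.Messages.Last().ToolCalls)
{
    if (toolCall.Function.Name == nameof(GetWeather))
    {
        var result = GetWeather(toolCall.Function.Arguments["location"], toolCall.Function.Arguments["format"]);

        // Send the tool output back as a 'tool' message and stream the model's follow-up answer
        await foreach (var token in chat.SendAs(ChatRole.Tool, result))
            Console.Write(token);
        Console.WriteLine();
    }
}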

More information

Tool support · Ollama Blog

Introducing new Ollama Connector for Local Models | Semantic Kernel (microsoft.com)

wullemsb/OllamaSharpDemo: Tools demo for OllamaSharp (github.com)
