
Tool support in OllamaSharp

Yesterday I talked about OllamaSharp as an alternative to Semantic Kernel for talking to your Ollama endpoint using C#. The reason I wanted to use Ollama directly instead of going through Semantic Kernel was that I wanted to give the recently announced Ollama tool support a try. And that is exactly what we are going to explore in this post.



Tool support

Tool support allows a model to answer a given prompt using tools it knows about, making it possible to interact with the outside world and, for example, call an API. It makes your model a lot smarter, as it can use information that was not part of its training data and do more than just return a response.

Remark: A similar feature exists in Semantic Kernel through the concept of Plugins, but as far as I'm aware the Plugins do not use the Ollama tool support (yet). While writing this post I noticed that a new Ollama connector was released for Semantic Kernel which uses OllamaSharp behind the scenes. I'll check whether this new connector can call the Ollama tool support and update this post with my findings. Promised!

To show the power of tool calling, we first ask for some weather information using the example we created yesterday:
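For reference, yesterday's example boils down to something like this (a minimal sketch; the endpoint URL, model name, and prompt are placeholders, and the exact OllamaSharp API surface depends on the version you use):

```csharp
using OllamaSharp;

// Connect to a locally running Ollama instance (adjust the URL for your setup).
var ollama = new OllamaApiClient(new Uri("http://localhost:11434"))
{
    SelectedModel = "mistral" // placeholder: whatever model yesterday's example used
};

var chat = new Chat(ollama);

// Stream the answer token by token to the console.
await foreach (var token in chat.SendAsync("What is the current weather in Brussels?"))
    Console.Write(token);
```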


The model politely explains that it cannot access weather information.

OK. Let's start by creating a tool that returns the weather for a location. It is important that we give a good description to the tool and to the parameters it expects:
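Here is a sketch of such a tool definition, assuming the Tool, Function, Parameters, and Properties types from OllamaSharp.Models.Chat; the tool name get_current_weather, its parameters, and the descriptions are illustrative choices, not values from the original demo (newer OllamaSharp releases also offer other ways to define tools):

```csharp
using OllamaSharp.Models.Chat;

var weatherTool = new Tool
{
    Type = "function",
    Function = new Function
    {
        Name = "get_current_weather",
        Description = "Get the current weather for a given location",
        Parameters = new Parameters
        {
            Type = "object",
            Properties = new Dictionary<string, Properties>
            {
                ["location"] = new Properties
                {
                    Type = "string",
                    Description = "The location to get the weather for, e.g. 'Brussels, Belgium'"
                },
                ["format"] = new Properties
                {
                    Type = "string",
                    Description = "The temperature unit to use in the answer",
                    Enum = new[] { "celsius", "fahrenheit" }
                }
            },
            Required = new[] { "location" }
        }
    }
};
```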


Now we can update our chat example from yesterday and pass in the tool instance:
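Continuing from the snippets above, that could look like this, assuming Chat.SendAsync has an overload that accepts a collection of tools (the tool-capable model name is a placeholder):

```csharp
// Switch to a model that supports tool calling (model name is a placeholder).
ollama.SelectedModel = "llama3.1";

var chat = new Chat(ollama);

// Pass the tool along with the prompt.
await foreach (var token in chat.SendAsync(
    "What is the current weather in Brussels?",
    new[] { weatherTool }))
{
    Console.Write(token);
}
```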

Remark: Notice that we also changed the model used. Right now only a limited list of models supports tool calling.

If we now ask for the weather, we no longer get a plain text answer back; instead the response contains ToolCall information.

What wasn't immediately clear to me is that the Ollama tool support will not invoke the tool for you. You need to call it yourself, based on the ToolCall information you get back from the LLM.

Here is an updated version where I call the GetWeather method I have available in my code:
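A rough sketch of that flow, continuing from the snippets above. GetWeather is a stubbed, hypothetical implementation, and I'm assuming the Chat class exposes the message history (Messages), the ToolCalls on the last message, and a SendAsAsync method for sending the tool result back with the tool role; double-check these names against the OllamaSharp version you use:

```csharp
using OllamaSharp.Models.Chat;

// Hypothetical local implementation of the tool;
// a real version would call an actual weather API.
static string GetWeather(string location, string format)
    => $"It is 23 degrees {format} and sunny in {location}.";

// Ask the question; when the model decides to use a tool,
// the streamed text answer is typically empty.
await foreach (var token in chat.SendAsync(
    "What is the current weather in Brussels?",
    new[] { weatherTool }))
{
    Console.Write(token);
}

// The model does not execute the tool itself; it only returns
// the ToolCall information (which tool, with which arguments).
var toolCalls = chat.Messages.LastOrDefault()?.ToolCalls;
if (toolCalls is not null)
{
    foreach (var call in toolCalls)
    {
        if (call.Function?.Name != "get_current_weather" || call.Function.Arguments is not { } args)
            continue;

        args.TryGetValue("location", out var location);
        args.TryGetValue("format", out var format);

        var result = GetWeather(
            location?.ToString() ?? "unknown",
            format?.ToString() ?? "celsius");

        // Feed the tool result back as a tool-role message so the model
        // can formulate the final answer.
        await foreach (var token in chat.SendAsAsync(ChatRole.Tool, result))
            Console.Write(token);
    }
}
```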

This time we get a real weather answer back.

Nice! 

Remark: Be aware that this code is far from production-ready, but at least you get the idea.

More information

Tool support · Ollama Blog

Introducing new Ollama Connector for Local Models | Semantic Kernel (microsoft.com)

wullemsb/OllamaSharpDemo: Tools demo for OllamaSharp (github.com)
