Yesterday I talked about OllamaSharp as an alternative to Semantic Kernel for talking to your Ollama endpoint using C#. The reason I wanted to use Ollama directly rather than through Semantic Kernel was that I wanted to give the recently announced Ollama tool support a try. And that is exactly what we are going to explore in this post.
Tool support
Tool support allows a model to answer a given prompt using tools it knows about, making it possible to interact with the outside world and, for example, call an API. It makes your model a lot smarter, as it can use information that was not part of its training data and do more than just return a response.
Remark: A similar feature exists in Semantic Kernel through the concept of Plugins, but as far as I’m aware these Plugins do not use the Ollama tool support (yet). While writing this post I noticed that a new Ollama connector was released for Semantic Kernel, which uses OllamaSharp behind the scenes. I’ll have a look at whether this new connector allows calling the Ollama tool support and update this post with my findings. Promised!
To show the power of tool calling, we first ask for some weather information using the example we created yesterday:
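A minimal sketch of that example with OllamaSharp could look like this (the endpoint URL and model name are placeholders, and the exact API surface may differ between OllamaSharp versions):

```csharp
using OllamaSharp;

// Point the client at a locally running Ollama instance (placeholder URL)
var ollama = new OllamaApiClient(new Uri("http://localhost:11434"))
{
    SelectedModel = "llama3" // placeholder model name
};

var chat = new Chat(ollama);

// Stream the answer token by token to the console
await foreach (var token in chat.SendAsync("What is the weather in Brussels?"))
    Console.Write(token);
```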
The model politely explains that it cannot access weather information.
OK. Let’s start by creating a tool that returns the weather for a location. It is important that we give a good description of the tool and of the parameters it expects:
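Sketched with OllamaSharp, such a tool definition could look roughly like this. The tool name and descriptions are made up for illustration, and the class names (`Tool`, `Function`, `Parameters`, `Properties`) follow the library's mapping of Ollama's JSON tool schema, which may differ per version:

```csharp
using OllamaSharp.Models.Chat;

// Describe the tool and its parameters so the model knows when and how to call it
var weatherTool = new Tool
{
    Type = "function",
    Function = new Function
    {
        Name = "get_current_weather", // hypothetical tool name
        Description = "Get the current weather for a location",
        Parameters = new Parameters
        {
            Type = "object",
            Properties = new Dictionary<string, Properties>
            {
                ["location"] = new()
                {
                    Type = "string",
                    Description = "The location to get the weather for, e.g. Brussels"
                }
            },
            Required = new[] { "location" }
        }
    }
};
```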
Now we can update our chat example from yesterday and pass along the tool instance:
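Assuming a tool instance like the one created above, passing it along could look roughly like this (the model name is a placeholder, and the `SendAsync` overload that accepts tools may vary per OllamaSharp version):

```csharp
using OllamaSharp;

var ollama = new OllamaApiClient(new Uri("http://localhost:11434"))
{
    // Tool calling requires a model that supports it (placeholder name)
    SelectedModel = "llama3.1"
};

var chat = new Chat(ollama);

// Hand the tool definition to the model together with the prompt
await foreach (var token in chat.SendAsync("What is the weather in Brussels?", new[] { weatherTool }))
    Console.Write(token);
```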
Remark: Notice that we also changed the model used. Right now only a limited set of models supports tool calling.
If we now ask for the weather, we’ll get the following response back:
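At the API level, Ollama reports such a tool call in the `tool_calls` field of the assistant message instead of a textual answer. An illustrative response body (values made up) looks roughly like this:

```json
{
  "model": "llama3.1",
  "message": {
    "role": "assistant",
    "content": "",
    "tool_calls": [
      {
        "function": {
          "name": "get_current_weather",
          "arguments": {
            "location": "Brussels"
          }
        }
      }
    ]
  },
  "done": true
}
```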
What wasn’t immediately clear to me is that the Ollama tool support will not invoke the tool for you. You need to call it yourself, based on the ToolCall information you get back from the LLM.
Here is an updated version where I call the GetWeather method I have available in my code:
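A rough sketch of that flow, assuming the `chat` and `weatherTool` from the earlier steps, a hypothetical local `GetWeather` method, and OllamaSharp's `ToolCalls` and `SendAsAsync` members (exact names may vary per version):

```csharp
using OllamaSharp.Models.Chat;

// A local method that actually knows the weather (hypothetical implementation)
string GetWeather(string location) => $"It is 22 degrees and sunny in {location}.";

// Ask the question; the model responds with a tool call instead of an answer
await foreach (var token in chat.SendAsync("What is the weather in Brussels?", new[] { weatherTool }))
    Console.Write(token);

// Inspect the tool calls the model requested
var toolCalls = chat.Messages.LastOrDefault()?.ToolCalls;
if (toolCalls is not null)
{
    foreach (var call in toolCalls)
    {
        if (call.Function?.Name == "get_current_weather")
        {
            var location = call.Function.Arguments?["location"]?.ToString() ?? "unknown";

            // Invoke the tool ourselves and feed the result back to the model
            // so it can phrase the final answer
            var result = GetWeather(location);
            await foreach (var token in chat.SendAsAsync(ChatRole.Tool, result))
                Console.Write(token);
        }
    }
}
```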
Now we get the following results back:
Nice!
Remark: Be aware that this code is far from production-ready, but at least you get the idea.
More information
Introducing new Ollama Connector for Local Models | Semantic Kernel (microsoft.com)
wullemsb/OllamaSharpDemo: Tools demo for OllamaSharp (github.com)