If you are a C# developer and want to interact with Ollama (which allows you to run Large Language Models locally), the easiest solution is to use Semantic Kernel. This works because Ollama exposes an OpenAI-compatible API.
However, I wanted to try some Ollama-specific features that are not yet exposed through Semantic Kernel. Does this mean I can no longer use C#?
Remark: While writing this post I noticed that an Ollama connector was released for Semantic Kernel that also uses OllamaSharp behind the scenes.
The good news is that you still can. Thanks to OllamaSharp you get .NET bindings for the Ollama API.
Getting started
Let’s write a simple demo application to try OllamaSharp:
- Create a new Console application:
dotnet new console -o OllamaSharpDemo
- Add the OllamaSharp NuGet package:
dotnet add package OllamaSharp
- Now we can start writing our code. First, create a new OllamaApiClient instance and specify the model we’ll use:
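A minimal sketch of this step, assuming Ollama is running locally on its default port (11434) and that a model such as llama3 has already been pulled; the `SelectedModel` property matches recent OllamaSharp releases and may differ slightly in older versions:

```csharp
using OllamaSharp;

// Point the client at the local Ollama endpoint (default port 11434)
var uri = new Uri("http://localhost:11434");
var ollama = new OllamaApiClient(uri);

// Select the model that every request should use
ollama.SelectedModel = "llama3";
```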
- Next, create a new chat and start interacting with our LLM through Ollama:
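A sketch of a simple chat loop, using the `Chat` class and the streaming `SendAsync` API found in recent OllamaSharp releases (older versions used a callback-based overload instead); `ollama` is the `OllamaApiClient` instance created above:

```csharp
// Start a chat session; it keeps the conversation history for us
var chat = new Chat(ollama);

while (true)
{
    Console.Write("You: ");
    var message = Console.ReadLine();
    if (string.IsNullOrWhiteSpace(message))
        break;

    // Stream the answer token by token as it is generated
    await foreach (var answerToken in chat.SendAsync(message))
        Console.Write(answerToken);

    Console.WriteLine();
}
```

Because the response is streamed, tokens appear in the console as soon as the model produces them instead of after the full answer is ready.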
- That’s it!
Tip: Also have a look at the OllamaSharpConsole app from the same author, a full-featured demo for interacting with your Ollama endpoint:
More information
Running large language models locally using Ollama (bartwullems.blogspot.com)
Introduction to Semantic Kernel | Microsoft Learn
awaescher/OllamaSharp: The easiest way to use the Ollama API in .NET (github.com)
awaescher/OllamaSharpConsole: Full featured demo application for OllamaSharp (github.com)