

Tweak your LLM models with Ollama–Using OpenWebUI

Yesterday I explained how we can create and upload our own language models in Ollama through the use of a modelfile. I explained the modelfile format and the different building blocks that can be used to define and configure a model. Today I want to continue on my previous post by explaining how to use OpenWebUI instead of doing everything by hand.

Start by opening OpenWebUI (check out my previous post on how to get it up and running), then:

1. Click on the Workspace section on the left.
2. Click on the + button in the Models section on the right.
3. Start editing your modelfile.
4. Hit Save & Create at the bottom.

After saving the new model, you can immediately test it.

More information

Explore and test local models using Ollama and OpenWebUI
Models | Open WebUI
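If you are wondering what to type in step 3, here is a minimal modelfile sketch (not from the original post; the base model, parameter values and system prompt are assumptions you would replace with your own):

# Build on top of an existing base model (assumed example: llama3)
FROM llama3

# Tune generation behavior (assumed example values)
PARAMETER temperature 0.7
PARAMETER num_ctx 4096

# Give the model a fixed persona
SYSTEM "You are a helpful assistant that answers in short, plain sentences."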
Recent posts

Tweak your LLM models with Ollama

If you want to create and share your own model through Ollama or tweak an existing model, you need to understand the Ollama model file. The model file is the blueprint to create and share models with Ollama.

Understanding the Ollama model file

Let us first have a look at an existing model file to give you an example. For that, you can use the following command:

ollama show <modelname> --modelfile

Let's give it a try:

ollama show phi4:latest --modelfile

# Modelfile generated by "ollama show"
# To build a new Modelfile based on this, replace FROM with:
# FROM phi4:latest
FROM C:\Users\bawu\.ollama\models\blobs\sha256-fd7b6731c33c57f61767612f56517460ec2d1e2e5a3f0163e0eb3d8d8cb5df20
TEMPLATE """{{- range $i, $_ := .Messages }}
{{- $last := eq (len (slice $.Messages $i)) 1 -}}
<|im_start|>{{ .Role }}<|im_sep|>
{{ .Content }}{{ if not $last }}<|im_end|>
{{ end }}
{{- if and...
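The excerpt is truncated here, but to round out the picture: once you have a modelfile of your own, the stock Ollama CLI builds and runs a model from it. A quick sketch (standard Ollama commands, not taken from the post; mymodel is a placeholder name):

# Build a local model named 'mymodel' from a modelfile in the current directory
ollama create mymodel -f ./Modelfile

# Chat with the freshly built model
ollama run mymodel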

Running LLMs locally using LM Studio

As I like to experiment a lot with AI, I always have to be careful to keep my token usage under control. And although the token cost has decreased over time for most models, the expenses can go up quite fast. That is one of the reasons I like to run (Large) Language Models locally. There are multiple ways to run a model locally, but my preferred way so far was Ollama (together with OpenWebUI). I also experimented with Podman AI Lab, but I always returned to Ollama in the end. Recently a colleague introduced me to LM Studio, another tool to run and test LLMs locally. With LM Studio, you can:

- Run LLMs offline on your local machine
- Download and run models from Hugging Face
- Integrate your own application with a local model using the LM Studio SDK or through the OpenAI-compatible endpoints
- Use the built-in RAG support to chat with your local documents

More than enough reasons to give it a try…

Getting started

I downloaded the installer from the websi...
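Those OpenAI-compatible endpoints are what make LM Studio easy to integrate. As a minimal sketch (not from the post), the snippet below calls a locally loaded model from C#, assuming LM Studio's local server runs on its default address http://localhost:1234 and that you replace the model placeholder with the identifier of the model you loaded:

// Minimal C# sketch: call LM Studio's OpenAI-compatible chat completions endpoint.
// Assumptions: the local server runs on the default http://localhost:1234,
// a model is loaded, and "your-loaded-model" is replaced with its identifier.
using System.Net.Http.Json;
using System.Text.Json;

using var client = new HttpClient { BaseAddress = new Uri("http://localhost:1234") };

var request = new
{
    model = "your-loaded-model",
    messages = new[]
    {
        new { role = "user", content = "Summarize why local LLMs help control token costs." }
    }
};

var response = await client.PostAsJsonAsync("/v1/chat/completions", request);
response.EnsureSuccessStatusCode();

// The response follows the standard OpenAI schema: choices[0].message.content
using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
Console.WriteLine(doc.RootElement
    .GetProperty("choices")[0]
    .GetProperty("message")
    .GetProperty("content")
    .GetString());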

Visual Studio 2022 – GitHub Copilot Vision support

I talked before about how you could start using Vision in GitHub Copilot. At that time, however, it was limited to VS Code. With the latest release of Visual Studio 2022 (17.13), this feature is now also available inside Visual Studio.

Important: As this feature is in preview, it may not be available to all users. Due to the gradual rollout, you may not see the option to attach images in chat immediately.

Get Started

If this feature is available for you, a new paperclip icon appears in your chat window. Now you can either directly paste an image from your clipboard or use the paperclip icon to upload an image. Once uploaded, you can add your prompt and send everything to Copilot. Copilot will then analyze the image and use it as additional context while generating a response.

Nice!

More information

Adding vision to GitHub Copilot

.NET 9 - Goodbye sln!

Although the csproj file evolved and simplified a lot over time, the Visual Studio solution file (.sln) remained an ugly file format full of magic GUIDs. With the latest .NET 9 SDK (9.0.200), we finally got an alternative: a new XML-based solution file (.slnx) was introduced in preview. So say goodbye to the ugly sln file, and meet its better-looking slnx brother instead.

To use this feature, we first have to enable it:

1. Go to Tools -> Options -> Environment -> Preview Features
2. Check the checkbox next to Use Solution File Persistence Model

Now we can migrate an existing sln file to slnx using the following command:

dotnet sln migrate AICalculator.sln
.slnx file D:\Projects\Test\AICalculator\AICalculator.slnx generated.

Or create a new Visual Studio solution using the slnx format:

dotnet new sln --format slnx
The template "Solution File" was created successfully.

The new format is not yet recognized by VSCode but it does work in Jetbr...
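The post's screenshots of the two file formats are not included in this excerpt, but to give an idea: a minimal .slnx file is just plain XML. A sketch based on the public .slnx format (the project path is borrowed from the AICalculator example above):

<!-- Minimal .slnx sketch: one solution, one project, no GUIDs -->
<Solution>
  <Project Path="AICalculator/AICalculator.csproj" />
</Solution>

Compare that to a classic .sln with its version headers, GUIDs and configuration blocks, and the appeal is obvious.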

C# - Set environment in unit tests

When monitoring our RabbitMQ usage, I noticed some strange behavior. After drilling a little deeper into our monitoring data, I discovered that the integration tests for one of our projects were running against our production environment! Whoops!! Luckily the impact was limited, but that doesn't mean we don't have to fix it.

The fix

Let me show you how we did it… The unit test code was using the Microsoft.AspNetCore.Mvc.Testing library to create an in-memory test host to test an API. Microsoft.AspNetCore.Mvc.Testing is part of the ASP.NET Core ecosystem and aims to simplify integration testing by providing essential tools and setup. Here is the original code: There is nothing wrong with the code itself, and it makes it very easy to write an end-to-end integration test against our web app: One of the API endpoints puts messages on a RabbitMQ queue to process them later. A test instance exists but, as I mentioned at the beginning of this post, instead of using th...
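The excerpt cuts off before the actual fix, but the usual way to keep a Microsoft.AspNetCore.Mvc.Testing host out of production settings is to pin the environment on the test host. A minimal sketch, not the post's actual code (Program, the "IntegrationTest" environment name and the /health endpoint are placeholders):

using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Mvc.Testing;
using Xunit;

public class ApiTests : IClassFixture<WebApplicationFactory<Program>>
{
    private readonly WebApplicationFactory<Program> _factory;

    public ApiTests(WebApplicationFactory<Program> factory)
    {
        // Force the in-memory test host into a dedicated environment so it
        // never loads production configuration (RabbitMQ endpoints included).
        _factory = factory.WithWebHostBuilder(builder =>
            builder.UseEnvironment("IntegrationTest"));
    }

    [Fact]
    public async Task HealthEndpoint_ReturnsSuccess()
    {
        var client = _factory.CreateClient();
        var response = await client.GetAsync("/health");
        response.EnsureSuccessStatusCode();
    }
}

With the environment pinned, appsettings.IntegrationTest.json (or an equivalent configuration source) controls which queue the tests talk to.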

Disable NuGet Central Package Management for specific projects

Central Package Management is a NuGet feature that allows you to manage NuGet package dependencies for multiple projects from a single location. This is particularly useful for large solutions with many projects, as it simplifies the process of keeping package versions consistent across the entire solution.

After enabling Central Package Management for an existing solution and committing those changes, one of our build pipelines started to fail. When we took a look at the logs, we encountered the following error message:

##[error]The nuget command failed with exit code(1) and error(NU1008: Projects that use central package version management should not define the version on the PackageReference items but on the PackageVersion items: Dapper;Microsoft.Data.SqlClient;Microsoft.Extensions.Configuration;Microsoft.Extensions.Configuration.Binder;Microsoft.Extensions.Configuration.Json;Microsoft.Extensions.Http;Spectre.Console;Spectre.Console.Json.

What is happening?

The reason we...
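The excerpt ends before the fix, but the title already hints at it: NuGet lets individual projects opt out of Central Package Management. A sketch using the documented MSBuild property (not the post's exact code):

<!-- In the .csproj of the project that should keep managing its own versions -->
<PropertyGroup>
  <ManagePackageVersionsCentrally>false</ManagePackageVersionsCentrally>
</PropertyGroup>

With this property set to false, that project can keep Version attributes on its PackageReference items while the rest of the solution stays on centrally managed versions.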