I'm a big fan of Ollama as a way to try out and run large language models locally. Today I got into trouble when I tried to connect to Ollama. When I tried to start Ollama through ollama serve, I got the following error message:

time=2024-12-02T21:15:55.398+01:00 level=ERROR source=common.go:279 msg="empty runner dir"
Error: unable to initialize llm runners unable to locate runners in any search path

I was able to fix the issue by going to the AppData\Local\Ollama folder. There, inside the updates subfolder, I found a newer version that I installed manually by executing OllamaSetup.exe. After the setup completed, Ollama was running again as expected.

More information

Ollama
Version 17.12 of Visual Studio 2022 brings a feature that I had been waiting on for a long time (and with long I really mean long). Of course you are wondering what feature I'm talking about.

Let me first set the scene by showing you the class I want to debug: I created a small Calculator example (a sketch of the class is included at the end of this post). Notice that I'm using two different syntaxes (a regular method body and an expression-bodied method). This is not an accidental inconsistency on my side, as you'll see later.

Now what if I wanted to debug the return values of these functions? Before the latest Visual Studio update, I typically used a temporary variable to inspect the return values, or took a look at the Autos window or the Watch window. With this release, you finally see the return values inline in the editor window.

Here is an example where I added the breakpoint at the end of the function: Unfortunately, this doesn't work (yet?) when using the expression-bodied method syntax; this is because ...
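For reference, here is a minimal sketch of what the Calculator class from the example above could look like. The class name comes from the post; the method names Add and Subtract and their signatures are my own assumptions, chosen only to show the two syntaxes side by side:

```csharp
public class Calculator
{
    // Regular method syntax: a full method body with an explicit return statement.
    // (Add/Subtract are hypothetical names used for illustration.)
    public int Add(int a, int b)
    {
        return a + b;
    }

    // Expression-bodied method syntax: the same idea written as a single expression.
    public int Subtract(int a, int b) => a - b;
}
```

Both methods do the same kind of work; the only difference is the syntax, which is exactly the inconsistency the post points out on purpose.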