Ollama remains my go-to tool for running LLMs locally. With the latest release, the Ollama team introduced a user interface, which means you no longer need the command line or tools like OpenWebUI to interact with the available language models.
After installing the latest release, you are welcomed by a new chat window similar to ChatGPT:
Interacting with the model can be done directly through the UI:
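The UI sits on top of the same local Ollama server that the CLI and REST API talk to, so you can still script the same interaction. A minimal sketch using the official `ollama` Python package (the model name `llama3.2` is just an example; use whichever model you have pulled locally):

```python
# Requires: pip install ollama (the official Ollama Python client)
# Assumes the Ollama server is running locally (it starts with the app).
import ollama

# Send a single chat message to a locally available model.
response = ollama.chat(
    model="llama3.2",  # example model name; substitute one you have pulled
    messages=[{"role": "user", "content": "Explain what a context window is."}],
)
print(response["message"]["content"])
```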
A history of earlier conversations is stored and available:
You can easily switch between models by clicking on the model dropdown. If a model is not yet available locally, you can download it immediately by clicking on the Download icon:
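Downloading a model through the UI is the equivalent of pulling it. A hedged sketch of doing the same from Python, again assuming the `ollama` package and an example model name:

```python
import ollama

# List the models that are already available locally.
for m in ollama.list()["models"]:
    print(m["model"])  # this key is "name" in older client versions

# Pull a model that is not yet available locally (example name).
ollama.pull("llama3.2")
```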
If you need a bigger context window, you can now change this directly from the settings:
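Under the hood this setting corresponds to the model's context length. If you are calling the API yourself, the equivalent (as I understand it) is the `num_ctx` option, which can be set per request:

```python
import ollama

# Ask for a larger context window for this request; num_ctx is the
# Ollama option controlling the context length in tokens.
response = ollama.chat(
    model="llama3.2",  # example model name
    messages=[{"role": "user", "content": "Summarize this long document ..."}],
    options={"num_ctx": 8192},  # the default is smaller, depending on the model
)
print(response["message"]["content"])
```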
Some other features worth mentioning are file support (simply drag and drop a file onto the chat window) and multimodal support:
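Multimodal input is also exposed through the API. A minimal sketch passing an image to a vision-capable model (the model name and file path are examples):

```python
import ollama

# Send an image alongside the prompt; this requires a multimodal model
# that you have pulled locally.
response = ollama.chat(
    model="llava",  # example vision-capable model
    messages=[{
        "role": "user",
        "content": "What is shown in this image?",
        "images": ["./screenshot.png"],  # example path to a local image
    }],
)
print(response["message"]["content"])
```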
More information
Explore and test local models using Ollama and OpenWebUI