Yesterday I showed how to run STRIDE GPT, an AI-based threat modelling tool, locally using Docker. I demonstrated how I used a local language model through Ollama running on the same machine as Docker Desktop.
To be able to access the Ollama endpoint from inside the Docker container, I had to use host.docker.internal, as you can see in this .env file:
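The relevant line looks roughly like this. The variable name OLLAMA_ENDPOINT is illustrative, not necessarily what STRIDE GPT expects; 11434 is Ollama's default port:

```shell
# .env — point the container at Ollama running on the Docker host
# (OLLAMA_ENDPOINT is an illustrative name; adjust to your setup)
OLLAMA_ENDPOINT=http://host.docker.internal:11434
```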
A colleague asked me: what if you are using Podman instead of Docker? Will host.docker.internal still work?
The short answer is no.
Luckily, this doesn’t have to be the end of this post, as an alternative exists for Podman. Instead of host.docker.internal, you need to use host.containers.internal.
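Under Podman, the corresponding line therefore only swaps the special hostname (again, the variable name is illustrative):

```shell
# .env — same idea under Podman; only the special hostname changes
OLLAMA_ENDPOINT=http://host.containers.internal:11434
```

If you want to confirm the hostname resolves from inside a container, a quick check such as running curl against http://host.containers.internal:11434 from within the container should reach the Ollama API on the host, assuming Ollama is listening on its default port.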