
Posts

Podman 5.3.1 - Error: unable to connect to Podman socket: open \.ssh\known_hosts:

Today I was helping a colleague get an existing application containerized. To run the containerized workload we were using Podman Desktop. When we opened Podman Desktop, we were asked whether we wanted to upgrade the Podman version. Of course we said yes! Unfortunately, after the upgrade had completed, Podman failed to start and kept returning the following error message:

> podman build
Cannot connect to Podman. Please verify your connection to the Linux system using `podman system connection list`, or try `podman machine init` and `podman machine start` to manage a new Linux VM
Error: unable to connect to Podman socket: open C:\Users\<username>\.ssh\known_hosts: The system cannot find the path specified.

We were able to continue by applying a workaround: after manually creating the .ssh folder in the user directory, Podman ran successfully. Problem solved! More information: Missing known_hosts file in Windows prevents connection as of 5.3.0 · Issue ...
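For reference, the workaround boils down to creating the missing folder and starting Podman again. A minimal sketch, assuming PowerShell and a default user profile location:

> mkdir "$env:USERPROFILE\.ssh"
> podman machine start

After that, restarting Podman Desktop (or the Podman machine) should connect normally.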
Recent posts

Podman Error: no context directory and no Containerfile specified

This one is just a small reminder for myself. Yesterday I talked about how I helped a colleague get started with Podman through Podman Desktop. Today he contacted me again when trying to build an updated image of the application. Here is what he tried to do:

>podman build

This failed with the following error message: Error: no context directory and no Containerfile specified. Do you notice what is missing? He forgot to specify the context directory. In this case he was running the command in the same location where the Dockerfile could be found, so we added a ‘.’ to specify that the current directory should be used:

>podman build .

If the Dockerfile can be found at another location, just specify the relative or absolute path:

>podman build path/to/directory

That’s it!
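As a side note (this was not part of my colleague's scenario): if the Containerfile itself does not live inside the build context, you can point to it explicitly with the -f flag while still passing the context directory. A hypothetical layout, just for illustration:

>podman build -f ./build/Containerfile .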

WSL–Getting started experience

Windows Subsystem for Linux (WSL) lets developers run a Linux environment directly on Windows, without the overhead of a traditional virtual machine or dual-boot setup. It is a very convenient way for developers to access and use a Linux environment. But although it is not that difficult to get started, it can still take some time before you really understand how to use it. If you also had some trouble getting started with WSL 2, or never even took a look at it before, I have some good news for you. Starting from WSL 2.4.4 (pre-release), you get a ‘Getting Started’ experience. Let me show…

Getting Started experience

First we need to make sure that we download and install the latest pre-release version. You can find the correct installer here. Download the msi file and run the installer. After the installation has completed, open the freshly installed WSL Settings app: The WSL Settings app gives you easy access to a lot of the options that can be configured w...
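If you prefer the command line over the msi installer, the same pre-release can, as far as I know, also be pulled in with the --pre-release flag, and you can verify what you ended up with afterwards. A quick sketch:

> wsl --update --pre-release
> wsl --version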

RAG Deep Dive series

Hello everyone, in case you missed the announcement, today a new deep dive series started about how to combine Azure OpenAI Service and Azure AI Search to build a powerful RAG solution. This 10-part series consists of the following parts:

The RAG solution for Azure: 13 January, 2025, 11:30 PM UTC | 3:30 PM PT
Customizing our RAG solution: 15 January, 2025, 11:30 PM UTC | 3:30 PM PT
Optimal retrieval with Azure AI Search: 20 January, 2025, 11:30 PM UTC | 3:30 PM PT
Multimedia data ingestion: 22 January, 2025, 11:30 PM UTC | 3:30 PM PT
User login and data access control: 27 January, 2025, 11:30 PM UTC | 3:30 PM PT
Storing chat history: 29 January, 2025, 11:30 PM UTC | 3:30 PM PT
Adding speech input and output: 3 February, 2025, 11:30 PM UTC | 3:30 PM PT
Private deployment: 5 February, 2025, 11:30 PM UTC | 3:30 PM PT
Evaluating RAG answer quality: 10 February, 2025, 11:30 PM UTC | 3:30 PM...

NuGet - Transitive pinning in Central Package Management

I'm a big fan of the Central Package Management feature of NuGet. It allows you to manage your NuGet package versions centrally instead of at the project level. Why is this important? It helps avoid multiple versions of the same package being used inside your solution, which could lead to strange behavior and unexpected bugs. Central Package Management solves this problem by managing all package versions at the solution level in (one or more) Directory.Packages.props files.

Transitive dependencies

By default only direct/top-level dependencies are managed through Central Package Management. But you also have transitive dependencies. Transitive dependencies refer to the indirect dependencies that a package brings into your project. When you install a NuGet package, it might depend on other packages, which in turn might have their own dependencies. These chains of dependencies create a dependency graph that can go several levels deep. Here's a breakdown of how it w...
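To make this concrete, here is a minimal sketch of what such a Directory.Packages.props file could look like with transitive pinning switched on (the package names and versions are just examples):

<Project>
  <PropertyGroup>
    <ManagePackageVersionsCentrally>true</ManagePackageVersionsCentrally>
    <!-- Pin transitive dependencies to the centrally managed versions as well -->
    <CentralPackageTransitivePinningEnabled>true</CentralPackageTransitivePinningEnabled>
  </PropertyGroup>
  <ItemGroup>
    <!-- Versions live here; project files only reference the package by name -->
    <PackageVersion Include="Newtonsoft.Json" Version="13.0.3" />
    <PackageVersion Include="Serilog" Version="4.0.0" />
  </ItemGroup>
</Project>

In the individual project files you then only add <PackageReference Include="Newtonsoft.Json" /> without a Version attribute; the version is resolved from the props file.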

Why asking for the ROI of AI in software development is the wrong question

When talking to customers about AI in software development, “What will be the ROI?” is a question that always pops up. On the surface, this seems like a valid question. After all, organizations need to justify investments in new technologies. But when it comes to AI in software development, I think this question is misguided and distracts from AI's transformative potential in modern software engineering.

The proof-of-concept disease

Many organizations are stuck in a cycle of proof-of-concept (PoC) projects aimed at demonstrating the value of AI in software development. They allocate resources to measure AI’s impact on developer productivity or code quality, hoping to calculate some tangible ROI. But here’s my opinion: we’re way past the point where AI’s usefulness in software development is debatable. Tools like GitHub Copilot, AI-powered testing frameworks, and automated code reviews have already shown that AI can enhance the software development lifecycle (SDLC...

Prevent breaking changes using Microsoft.CodeAnalysis.PublicApiAnalyzers

For one of my clients, I maintain a set of libraries that help to streamline the development of new applications in .NET Core. The functionality offered through these libraries follows the 'paved road' principle and helps the development teams fall into the pit of success. They are used by multiple development teams working on different projects. As a library author, I don’t have full control over all the different ways these libraries are used, so I have to be very conscious about avoiding breaking changes. In this post I want to share some details on how I try to manage and avoid breaking changes.

Microsoft.CodeAnalysis.PublicApiAnalyzers

The ‘magic’ solution I use to avoid or handle breaking changes is a specific Roslyn analyzer: Microsoft.CodeAnalysis.PublicApiAnalyzers. Let me explain how this analyzer works. First add a package reference to Microsoft.CodeAnalysis.PublicApiAnalyzers to your project:

dotnet add package Microsoft.CodeAnalysis.Pub...
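For completeness, and with the caveat that the summary above is cut off, the basic setup is a sketch like this (project-specific details omitted):

dotnet add package Microsoft.CodeAnalysis.PublicApiAnalyzers

Next to the project you then maintain two text files, PublicAPI.Shipped.txt (the public API surface that has already been released) and PublicAPI.Unshipped.txt (members added since the last release). The analyzer reports a diagnostic for every public member that is not declared in these files, which turns every change to the public surface into an explicit, reviewable edit.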