Posts

Showing posts from 2024

Podman Desktop–The WSL import of guest OS failed: exit status 0xffffffff

To avoid carrying around multiple laptops for different customers, I typically ask whether they can provide me with a VDI (for example through Azure Virtual Desktop). One of my clients is not on a cloud platform (yet), so the VM they provided was running on a local (read: in a traditional datacenter) Hyper-V cluster. As we were investigating a move to a container-based development model, I installed Podman Desktop on the provided machine. Podman Desktop requires a Podman machine to be created before it can run any workloads. However, when I tried to create a new machine, it failed with the following error message:

Error: the WSL import of guest OS failed: exit status 0xffffffff

I made a second attempt through the command line, but this failed as well:

podman machine init
Extracting compressed file: podman-machine-default-amd64: done
Importing operating system into WSL (this may take a few minutes on a new WSL install)...
WSL2 is not supported with your current machine configuration. Pl…

GraphQL–Application Insights integration for HotChocolate 13

If you are a regular reader of my blog, you've certainly seen my earlier post on how to integrate Application Insights telemetry in your HotChocolate-based GraphQL backend. IMPORTANT: Although the code below will still work, I would recommend switching to the integrated OpenTelemetry functionality and sending the information to Application Insights that way. Today I had to upgrade an application to HotChocolate 13 and (again) I noticed that the application no longer compiled. So for people who are using the ApplicationInsightsDiagnosticsListener I shared, here is an updated version that works for HotChocolate 13: As I wanted to track the full operation, I overrode the ExecuteRequest method; however, if you only want to log exceptions, I would recommend overriding the RequestError and ResolverError methods:

More information
HotChocolate OpenTelemetry (bartwullems.blogspot.com)
GraphQL HotChocolate 11 - Updated Application Insights monitoring (bartwullem…
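The updated listener itself falls outside this excerpt, so purely as an orientation, here is a minimal sketch of what an exception-logging diagnostic event listener for HotChocolate 13 could look like, assuming a TelemetryClient is available through DI; the class and member names follow the HotChocolate 13 diagnostics API, but the exact code from the post may differ:

```csharp
using System;
using HotChocolate;
using HotChocolate.Execution;
using HotChocolate.Execution.Instrumentation;
using HotChocolate.Resolvers;
using Microsoft.ApplicationInsights;

// Sketch: forward request and resolver errors to Application Insights.
public class ApplicationInsightsDiagnosticsListener : ExecutionDiagnosticEventListener
{
    private readonly TelemetryClient _telemetryClient;

    public ApplicationInsightsDiagnosticsListener(TelemetryClient telemetryClient)
    {
        _telemetryClient = telemetryClient;
    }

    public override void RequestError(IRequestContext context, Exception exception)
    {
        _telemetryClient.TrackException(exception);
    }

    public override void ResolverError(IMiddlewareContext context, IError error)
    {
        if (error.Exception is not null)
        {
            _telemetryClient.TrackException(error.Exception);
        }
    }
}
```

Such a listener would then be registered on the GraphQL server with AddDiagnosticEventListener&lt;ApplicationInsightsDiagnosticsListener&gt;().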

Azure Application Insights–Collect Performance Counters data - Part II

About two years ago I blogged about the possibility to collect performance counter data as part of your Application Insights telemetry. If you missed the original post, first have a quick read here and then come back. Back? As mentioned in the original post, to be able to collect this performance data, the Application Pool user needs to be added to both the Performance Monitor Users and Performance Log Users groups. Although I updated our internal wiki to emphasize this, I still noticed that most developers forget to do this, with no collected data as a result. (Who reads documentation anyway?) So let's do what any developer does in this case: let's automate the problem away… I created a small tool that can be executed on the IIS web server. It goes through all the application pools, extracts the corresponding user and adds it to the two groups mentioned above. I think the code is quite self-explanatory. Here is the part where I read out the application pools: And here I ad…
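The code itself is not part of this excerpt, but as a rough sketch, a tool like the one described could combine Microsoft.Web.Administration (to read the application pools) with System.DirectoryServices.AccountManagement (to add the identities to the local groups). The outline below is an assumption-laden sketch rather than the post's actual code; among other things, it only handles pools running under a specific user account, and domain accounts may need a domain PrincipalContext instead of a machine one:

```csharp
using System;
using System.DirectoryServices.AccountManagement;
using Microsoft.Web.Administration;

// Sketch: add every custom application pool identity to the two groups
// required for collecting performance counter data.
var groupNames = new[] { "Performance Monitor Users", "Performance Log Users" };

using var serverManager = new ServerManager();
using var machineContext = new PrincipalContext(ContextType.Machine);

foreach (var pool in serverManager.ApplicationPools)
{
    // Only handle pools that run under a specific (custom) account.
    if (pool.ProcessModel.IdentityType != ProcessModelIdentityType.SpecificUser)
        continue;

    var userName = pool.ProcessModel.UserName;
    using var user = UserPrincipal.FindByIdentity(machineContext, userName);
    if (user is null)
    {
        Console.WriteLine($"Could not resolve user '{userName}' for pool '{pool.Name}'.");
        continue;
    }

    foreach (var groupName in groupNames)
    {
        using var group = GroupPrincipal.FindByIdentity(machineContext, groupName);
        if (group is not null && !group.Members.Contains(user))
        {
            group.Members.Add(user);
            group.Save();
            Console.WriteLine($"Added '{userName}' to '{groupName}'.");
        }
    }
}
```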

MassTransit–Change log level

A short post today: a colleague wanted to investigate a production issue when sending and receiving messages using MassTransit. He couldn't immediately find how to change the log level to capture more details about the MassTransit internals. I had a look together with him, so for further reference you can find the answer here.

Using Serilog
On this specific project he was using Serilog, so let me first show you how to change the log level for MassTransit when using Serilog:

Using Microsoft.Extensions.Logging
And here are the code changes required when using the default Microsoft.Extensions.Logging:

Remark: I would certainly recommend having a look at the observability features of MassTransit as well and considering implementing OpenTelemetry in your application.

More information
Logging · MassTransit
Observability · MassTransit
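The actual snippets are not included in this excerpt, but here is a hedged sketch of the two approaches described (a Serilog MinimumLevel override for the MassTransit source context, and a category filter in Microsoft.Extensions.Logging), assuming a typical ASP.NET Core host; in practice you would pick one of the two:

```csharp
using Microsoft.Extensions.Logging;
using Serilog;
using Serilog.Events;

var builder = WebApplication.CreateBuilder(args);

// Option 1 - Serilog: keep the overall level at Information, but capture
// the MassTransit internals at Debug through a source-context override.
Log.Logger = new LoggerConfiguration()
    .MinimumLevel.Information()
    .MinimumLevel.Override("MassTransit", LogEventLevel.Debug)
    .WriteTo.Console()
    .CreateLogger();
builder.Host.UseSerilog();

// Option 2 - Microsoft.Extensions.Logging: add a category filter for MassTransit.
builder.Logging.AddFilter("MassTransit", LogLevel.Debug);

var app = builder.Build();
app.Run();
```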

ImageSharp–Image.Load() Compiler error after updating

On one of my pet projects I'm using ImageSharp to manipulate images. It is a great library that comes with a lot of features, and its performance and memory usage are good, which is quite important when manipulating images. Remark: Be aware that a commercial license is needed under some conditions, so check out the pricing page before adding this library to your application. I decided to add some new stuff, but before I did that I updated the library to the latest version. However, after doing that my original code no longer compiled. Let's see how we can fix this… Here is the original code: This specific overload no longer exists in the latest version. So I started to update my code to a supported overload: Remark: Notice that I also switched to an async overload, but know that a synchronous version exists as well. However, by doing that I no longer had the format information available. The good news is that this information can now be accessed through the image.Metadata…
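The original and updated snippets are missing from this excerpt, so purely as an illustration of the kind of change described, here is a sketch assuming ImageSharp 3.x, where the Load overload with an out IImageFormat parameter was removed and the decoded format is exposed on the image metadata instead:

```csharp
using System;
using SixLabors.ImageSharp;
using SixLabors.ImageSharp.Formats;

// Old code (ImageSharp 2.x style): the format came back through an out parameter.
// using var image = Image.Load("photo.jpg", out IImageFormat format);

// Updated code (ImageSharp 3.x style): load the image (an async overload is used here,
// a synchronous one exists as well) and read the format from the metadata afterwards.
using var image = await Image.LoadAsync("photo.jpg");
IImageFormat? format = image.Metadata.DecodedImageFormat;

Console.WriteLine($"Loaded a {image.Width}x{image.Height} image, format: {format?.Name}");
```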

Code & Comedy 2024–This session will give you 2.6 hours of your time back

Yesterday I had the pleasure of presenting at Code & Comedy 2024. Again it was a great combination of inspiring sessions, great food and a lot of nice people to meet. All of this was followed by a comedy act by Jan Jaap van der Wal. I never thought that AI could be that much fun! Note to myself: next time make sure that I'm not one of the two Flemish guys in the room. I did a presentation titled "This session will give you 2.6h a day of your time back!", which is a hard promise to make, so I hope that the participants can confirm whether I succeeded or not. In the session I shared some surprising insights from behavioral science and explained how we can apply this knowledge when building AI assistants. In case you missed my session or want to have a look at the code in more detail, here are the relevant links:

Presentation: wullemsb/presentations: Repo with all my (public) presentations (github.com)
Source code: wullemsb/SemanticKernel: Demo code for my AI session (githu…

GitHub Copilot extension disabled after upgrading to Visual Studio 17.10.2

After upgrading my Visual Studio to 17.10.2, I got a warning that my GitHub Copilot and GitHub Copilot Chat extensions were incompatible with my Visual Studio version and would be disabled:

The following extensions are disabled because the installed version is not compatible with Visual Studio Pro 17.10.34505.107.
- GitHub Copilot
- GitHub Copilot Chat

I started to panic a little, because once you are used to having a Copilot assistant you never want to go back. So after my Visual Studio was up and running, I opened the extensions window and indeed both extensions were disabled: I first tried to reinstall the extension, but I got the same incompatibility message. Does this mean that I can no longer use Copilot? Luckily the answer is no. Starting from Visual Studio 17.10.2, GitHub Copilot and Copilot Chat are no longer separate extensions but have become part of Visual Studio itself. So although the extensions are no longer there, Copilot (Chat) was still available, as I found out. A bette…

List Process IDs (PIDs) for your IIS Application Pools

One of our production servers started to send out alerts because the memory consumption was too high. A first look at the server showed us that one of the w3wp processes was consuming most of the available resources on the server.

Sidenote: Web applications running within Microsoft's Internet Information Services (IIS) use what are known as IIS worker processes. These worker processes run as w3wp.exe, and there are typically multiple per server.

So we knew that one of our web applications was causing trouble. The question now, of course, is: which one? As the resource manager only showed us the process ID, we couldn't immediately give the answer. But with some extra command line magic we got the answer. To get the information we needed, we used the appcmd utility. This tool allows you, among other things, to list the IIS worker processes. We executed appcmd list wps and got the list of worker processes with their PIDs:

c:\Windows\System32\inetsrv>appcmd list wps
WP…
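If you prefer to get the same mapping from code instead of appcmd, a small sketch using Microsoft.Web.Administration could look like the following (assuming it runs elevated on the IIS server itself; this is an illustration, not part of the original post's steps):

```csharp
using System;
using Microsoft.Web.Administration;

// Sketch: list the worker process PIDs per IIS application pool,
// similar to what 'appcmd list wps' shows.
using var serverManager = new ServerManager();

foreach (var pool in serverManager.ApplicationPools)
{
    foreach (var workerProcess in pool.WorkerProcesses)
    {
        Console.WriteLine($"PID {workerProcess.ProcessId} -> application pool '{pool.Name}'");
    }
}
```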

The Red Hat cloud native architecture solution patterns

Being a software architect, I'm always looking for good resources that can help me design better solutions. One of the resources I can recommend is the Solution Patterns from Red Hat. Although mainly focused on Red Hat technologies, most of the described patterns are applicable in a broader context. For every solution pattern, you get more information about the problems it tackles and the use cases it solves, a reference architecture and a technical implementation. For example, let's have a look at API versioning. As you can see, you get some use cases where API versioning plays a role: And a high-level solution: If you want to drill further down into the details, you can have an in-depth look at the solution's architecture. With all this information you can make an informed decision on whether this solution pattern could help you in solving your specific needs. So bookmark this link and go explore the other solution patterns.

More information
Solution Patterns from Red H…

SonarQube–Troubleshooting

After upgrading a SonarQube server at one of my clients, we encountered some issues. In this post I'll walk you through each problem we experienced and how we solved it. Let's start!

java.lang.IllegalStateException: Fail to connect to database

Our first exception doesn't tell us that much, so let's have a look at the logs:

2024.05.10 10:43:33 INFO  web[][o.s.p.ProcessEntryPoint] Starting Web Server
2024.05.10 10:43:33 INFO  web[][o.s.s.a.TomcatHttpConnectorFactory] Starting Tomcat on port 9000
2024.05.10 10:43:35 INFO  web[][o.s.s.p.LogServerVersion] SonarQube Server / 10.5.1.90531 / a69dd88808638ec33015277e236bca319314ce8b
2024.05.10 10:43:35 INFO  web[][o.s.d.DefaultDatabase] Create JDBC data source for jdbc:sqlserver://server:1433;databaseName=Tfs_Sonar;integratedSecurity=true
2024.05.10 10:43:35 INFO  web[][c.z.h.HikariDataSource] HikariPool-1 - Starting...
2024.05.10 10:43:37 WARN  web[][o.s.c.a.AnnotationConfigApplicationContext] Exception encountered during co…

Why pair programming should be your default way of working

Although pair programming is not a new (software development) practice, I don't see it used that often at my clients. I think the main reason for this is that not many developers have tried it, and therefore they have never experienced the benefits of pair programming. Unknown is unloved. But I experienced first-hand that once you get comfortable with pair programming, you don't want to go back. Most teams I worked with that started using it continued with the practice, sometimes partially, sometimes full time.

Benefits of pair programming
Pair programming has several key benefits and purposes, which can significantly enhance the efficiency and effectiveness of development teams.

The Power of Two Minds: 1+1 > 2
One of the primary benefits of pair programming is the enhanced knowledge sharing that occurs when two developers collaborate closely. This collaboration helps in various ways:

Knowledge Sharing: Both programmers continuously learn from each o…

Debugging Semantic Kernel in C#

I'm working on a (small) project where I'm using Microsoft Semantic Kernel. Although it makes it really easy to build your own copilots and AI agents, it is not that easy to understand what is going on (or what is going wrong). Just in case you have no clue what Semantic Kernel is: Semantic Kernel is an open-source SDK that lets you easily build agents that can call your existing code. It is highly extensible and you can use Semantic Kernel with models from OpenAI, Azure OpenAI, Hugging Face, and more. It can be used in C#, Python and Java code. After adding the Microsoft.SemanticKernel.Core and Microsoft.SemanticKernel.Connectors.OpenAI NuGet packages, the simplest implementation to write is the following: That's easy, and there is not much magic going on as you are directly controlling the prompt that is shared with the LLM. However, when you start to use the more complex features like the planner, you don't know what exactly is shared with the LLM. In…
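The snippet itself falls outside this excerpt; as a rough sketch of that "simplest implementation", assuming a recent Semantic Kernel 1.x release (the model id and API key handling below are placeholders, and the exact code from the post may differ):

```csharp
using System;
using Microsoft.SemanticKernel;

// Minimal Semantic Kernel setup: register an OpenAI chat completion service
// and send a single prompt straight to the model.
var builder = Kernel.CreateBuilder();
builder.AddOpenAIChatCompletion(
    modelId: "gpt-4o-mini",                                         // placeholder model id
    apiKey: Environment.GetEnvironmentVariable("OPENAI_API_KEY")!); // placeholder key source

var kernel = builder.Build();

// You control the exact prompt that is shared with the LLM.
var result = await kernel.InvokePromptAsync("Explain Semantic Kernel in one sentence.");
Console.WriteLine(result);
```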