

Showing posts from May, 2021

Azure Pipelines–Build completion triggers

I’m currently migrating an existing CI/CD pipeline built in Azure DevOps from the ‘classic’ build approach to YAML templates. It has been quite a journey, so expect a lot of posts in the upcoming days where I share everything I learned along the way. In our original setup we had an Azure DevOps classic pipeline that was used to create a Docker image and push it to ACR (“the CI part”). After that a release pipeline was triggered that took the image from ACR and deployed it to multiple AKS clusters (“the CD part”). The goal was to keep this way of working and only make the switch from the ‘classic’ build approach to YAML templates. So the first thing I had to find out was how I could trigger the CD build when the CI build completed. Let me first show you the approach that is most similar to the classic one, using ‘Build completion triggers’. Although this approach still works, it is no longer recommended. The recommended approach is to specify pipeline triggers direc…
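The recommended YAML approach uses a pipeline resource with a trigger. A minimal sketch, assuming the CI pipeline is named ‘Docker-CI’ (the pipeline name and branch are placeholders, not the names from our actual setup):

```yaml
# CD pipeline: runs when the CI pipeline completes successfully
resources:
  pipelines:
    - pipeline: ci              # alias used to reference the resource
      source: 'Docker-CI'       # hypothetical name of the CI pipeline
      trigger:
        branches:
          include:
            - master

trigger: none                   # the CD pipeline has no CI trigger of its own

steps:
  - script: echo Deploying image produced by run $(resources.pipeline.ci.runID)
```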

NSwag error - System.InvalidOperationException: No service for type 'Microsoft.Extensions.DependencyInjection.IServiceProviderFactory`1[Autofac.ContainerBuilder]' has been registered.

In one of our projects we are using NSwag to generate the TypeScript DTOs and services used in our Angular frontend. In this project we are using Autofac as our IoC container and have created a few extension methods that hook into the HostBuilder bootstrapping. Unfortunately our custom logic got NSwag into trouble and caused our build to fail with the following error message: System.InvalidOperationException: No service for type 'Microsoft.Extensions.DependencyInjection.IServiceProviderFactory`1[Autofac.ContainerBuilder]' has been registered. NSwag adds an extra build target to your csproj file and uses it to run the NSwag code generator: While investigating the root cause of this issue, we introduced a small workaround where we used a separate Program.cs and Startup.cs specifically for the NSwag code generator. We added a minimal Program.cs file: And a minimal Startup.cs file: The magic to make this work is to change the nswag.json configurat…
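A minimal sketch of what such a separate Program.cs could look like; the point is that it deliberately omits the Autofac UseServiceProviderFactory call, so NSwag can bootstrap the host with the default container (the class names are illustrative, not the ones from the original post):

```csharp
// Program.cs used only by the NSwag code generator;
// it skips the Autofac service provider factory on purpose.
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.Hosting;

public class Program
{
    public static void Main(string[] args) => CreateHostBuilder(args).Build().Run();

    public static IHostBuilder CreateHostBuilder(string[] args) =>
        Host.CreateDefaultBuilder(args)
            .ConfigureWebHostDefaults(webBuilder => webBuilder.UseStartup<Startup>());
}
```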

Microsoft Build 2021–Book of news

Microsoft continues its new tradition of bundling all important announcements in a ‘Book of News’. So, if you don’t have the time to watch Satya Nadella’s keynote or the Scott Hanselman-and-friends version of it, or to work through the long list of available sessions, the Book of News is there to guarantee you are up to speed with the latest and greatest in the Microsoft ecosystem.

Azure Pipelines–Automatic Package Versioning

There are a few options available when configuring your NuGet package versioning scheme in your build pipeline. Let’s explore the differences:

Off: the package version is not changed during the build. Instead, the version provided in your csproj file is used.

Use the date and time: when selecting this option, it is up to you to provide a Major, Minor and Patch version number. The date and time are used as the prerelease label: $(Major).$(Minor).$(Patch).$(date:yyyyMMdd)

Use an environment variable: a third option is to use an environment variable. The environment variable should contain the version number that you want to use, in a valid format. Remark: enter the name of the environment variable without $, $env, or %.

Use the build number: a last option is to use the build number to version the package. Remark: this will change the build number format to a version compatible defini…
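In a YAML pipeline these same options map to the versioningScheme input of the NuGetCommand pack task. A sketch of the ‘Use the date and time’ variant (the version numbers and glob pattern are placeholders):

```yaml
- task: NuGetCommand@2
  inputs:
    command: 'pack'
    packagesToPack: '**/*.csproj'
    versioningScheme: 'byPrereleaseNumber'  # 'off', 'byEnvVar' and 'byBuildNumber' are the other options
    majorVersion: '1'
    minorVersion: '0'
    patchVersion: '0'
```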

.NET Coding Pack

I blogged before about the coding packs for Visual Studio Code. A coding pack is an all-in-one installer that will set you up for Python, Java, … As of this week the list of available coding packs got extended with a coding pack for C# and .NET. You can download the pack from . Walk through the different steps of the installation wizard to start your C# development journey: After completing the installation, Visual Studio Code is launched and some extra extensions are added to VSCode: Once everything is installed, a .NET Interactive Notebook is available that allows you to learn about and try out all C# features in an interactive way (similar to Jupyter notebooks):

NETSDK1005–Asset file is missing target

When trying to build a project using Azure Pipelines, it failed with the following error message: NETSDK1005: Asset file is missing target. This error message is not very descriptive and it was not immediately obvious where the mistake was. In this case the problem was that I was combining an older NuGet.exe to restore the NuGet packages with a .NET 5 project. From the documentation: NuGet writes a file named project.assets.json in the obj folder, and the .NET SDK uses it to get information about packages to pass into the compiler. In .NET 5, NuGet added a new field named TargetFrameworkAlias, so earlier versions of MSBuild or NuGet generate an assets file without the new field. To fix the issue I had to change the NuGet Tool Installer task to use NuGet version 5.8 or higher: More information:
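The fix boils down to pinning the NuGet Tool Installer task to a recent enough version; in YAML that could look like this:

```yaml
- task: NuGetToolInstaller@1
  inputs:
    versionSpec: '>=5.8'   # NuGet 5.8 or higher understands TargetFrameworkAlias
- task: NuGetCommand@2
  inputs:
    command: 'restore'
    restoreSolution: '**/*.sln'
```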

SonarQube - SQL Server Integrated Security

While moving our build agents from one server to another, I also had to move our SonarQube instance. To install the SonarQube instance I followed the instructions mentioned here: As I was using SQL Server with Integrated Security, I paid special attention when reading this section: To use integrated security: Download the Microsoft SQL JDBC Driver 9.2.0 package and copy mssql-jdbc_auth-9.2.0.x64.dll to any folder in your path. If you're running SonarQube as a Windows service, make sure the Windows account under which the service is running has permission to connect to your SQL Server. The account should have db_owner database role membership. If you're running the SonarQube server from a command prompt, the user under which the command prompt is running should have db_owner database role membership. Ensure that the sonar.jdbc.username or sonar.jdbc.password properties are…
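The relevant part of conf/sonar.properties then looks roughly like this (the server and database names are placeholders); with integrated security the username and password properties must stay commented out:

```
# conf/sonar.properties
sonar.jdbc.url=jdbc:sqlserver://localhost;databaseName=sonar;integratedSecurity=true

# Leave these commented out when using integrated security:
#sonar.jdbc.username=
#sonar.jdbc.password=
```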

Azure Pipelines–SonarQube analysis

After moving our build agents from one server to another, one of the builds no longer worked. When looking at the logs, I noticed that the build failed almost immediately with the following error message: No agent found in pool Azure Pipelines which satisfies the specified demands: java. And indeed, when I took a look at the specific pipeline demands I could see that java was required. Turns out that the SonarQube analysis requires Java to be installed on the build server to be able to execute. As the build server was a new clean install, nothing was installed yet. Install Java OpenJDK: time to fix that… The license terms of the Oracle JDK have changed and updates are no longer free. Therefore I chose to install the Azul OpenJDK (download it here). I used the MSI and walked through the installation wizard. After completing the setup I manually created the JAVA_HOME environment variable and set it to the bin folder of the Zulu installation (e.g. C:\Program Files\Zulu\zulu-11\b…

NuGet - Add Global Package source to your build server

A few years ago I blogged about how to add a global package source on your build server. Over the years NuGet has evolved and the approach described in that blog post no longer applies. For the latest NuGet version, the config file is located here: %appdata%\NuGet\NuGet.config. You can either directly edit the values inside this NuGet.config or you can add an extra package source through the nuget sources command: nuget sources add -Name "My Custom Package Source" -Source https://myfeedlocation/nuget/v3/index.json More information: Remark 1: This config file is scoped to the current user, so it is important to execute this command under the user account used for your build agent. Remark 2: If possible I would recommend avoiding this approach and adding a nuget.config to your solution or project instead. Add the reference to the package source in there. The advantage of doing that is th…
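A solution-level nuget.config, as suggested in remark 2, could look like this (reusing the example feed URL from above):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- checked in next to the .sln file, so every build agent picks it up -->
    <add key="My Custom Package Source" value="https://myfeedlocation/nuget/v3/index.json" />
  </packageSources>
</configuration>
```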

Azure DevOps–Disable CI trigger

The default YAML pipeline Azure DevOps creates for you looks like this: This pipeline is triggered every time a change is pushed to the master branch, as you can see in the ‘trigger’ section. But I wanted to trigger this pipeline only manually. To achieve this you need to update the pipeline and set the ‘trigger’ value to ‘none’: More information: Azure Pipeline Triggers
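The change itself is a one-liner at the top of the pipeline; the rest of this sketch is just a minimal pipeline around it:

```yaml
# Disable the CI trigger; the pipeline can now only be queued manually
trigger: none

pool:
  vmImage: 'ubuntu-latest'

steps:
  - script: echo 'Only runs when queued manually'
```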

GraphQL- Schema Stitching Handbook

Although a lot of attention goes to GraphQL Federation, GraphQL Schema Stitching remains a powerful alternative. For everyone new to GraphQL Schema Stitching I would recommend the Schema Stitching Handbook. It shows a lot of examples of what is possible through Schema Stitching: combining local and remote schemas, single-record type merging, array-batched type merging, nullable merges, cross-service interfaces, merged types with multiple keys, and computed fields.

Kubernetes- How adding healthchecks made our applications less resilient

When using a container orchestrator like Kubernetes, it is recommended to add health check probes to your application. These health probes can be used to check the app’s status and help the container orchestrator decide when to restart a container, when to start sending traffic, … So we decided to use the Microsoft.AspNetCore.Diagnostics.HealthChecks package to add a health check endpoint to our ASP.NET Core applications: Inside the health check, we test our app dependencies to confirm availability and normal functioning. So far, so good… Unfortunately we went a little bit too far with this approach, which got us into trouble. What did we do wrong? To answer that, I first need to explain a little bit about our architecture. We use an API gateway as the single entry point for all of our services: This API gateway is just another ASP.NET Core application that uses GraphQL schema stitching to bring all our APIs together in one logical schema. And of course we also added a health check endpoi…
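For reference, wiring up such an endpoint with the Microsoft.AspNetCore.Diagnostics.HealthChecks package is only a few lines in Startup; the custom check type below is a hypothetical example, not our actual implementation:

```csharp
public void ConfigureServices(IServiceCollection services)
{
    services.AddHealthChecks()
        // hypothetical custom check that verifies a downstream dependency
        .AddCheck<DownstreamApiHealthCheck>("downstream-api");
}

public void Configure(IApplicationBuilder app)
{
    app.UseRouting();
    app.UseEndpoints(endpoints =>
    {
        endpoints.MapHealthChecks("/health");
    });
}
```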

IIS–Failed Request Tracing

One of the ways to monitor your IIS traffic is through ‘Failed Request Tracing’. This feature allows you to trace the full request pipeline and capture all the details. Failed-request tracing is designed to buffer the trace events for a request and only flush them to disk if the request "fails", where you provide the definition of "failure". If you want to know why you're getting 404.2 error messages or why requests start hanging, failed-request tracing is the way to go. Installing Failed Request Tracing: Failed Request Tracing is not available out of the box in IIS but can be installed as part of the ARR (Application Request Routing) feature. ARR can be installed directly from here or through the Web Platform Installer when available in IIS. After the installation is completed, the Failed Request Tracing feature becomes available in IIS. But before you can use it, you need to enable it. If Failed Request Tracing is still not available after installi…

Serilog–Add headers to request log

By default, logging in ASP.NET Core generates a lot of log messages for every request. Thanks to Serilog’s RequestLoggingMiddleware that comes with the Serilog.AspNetCore NuGet package, you can reduce this to a single log message: But what if you want to extend the log message with some extra data? This can be done by setting values on the IDiagnosticContext instance. This interface is registered as a singleton in the DI container. Here is an example where we add some header info to the request log:
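A minimal sketch of that idea, assuming Serilog’s request logging middleware is already in place and an inline middleware adds the User-Agent header to the diagnostic context:

```csharp
// In Startup.Configure: Serilog's middleware produces one summary event per request
app.UseSerilogRequestLogging();

// Inline middleware that enriches that summary event with a request header
app.Use(async (context, next) =>
{
    var diagnosticContext = context.RequestServices.GetRequiredService<IDiagnosticContext>();
    diagnosticContext.Set("UserAgent", context.Request.Headers["User-Agent"].ToString());
    await next();
});
```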

Tools and resources for Agile forecasting

As a developer or tech lead, sooner or later you will be asked to estimate. Of course you could just throw out a ballpark figure, but it is better to use some historical data about your team’s performance. In that case I would recommend having a look at the online calculators and forecasting spreadsheets created by Focused Objective. They provide a lot of tools, but also articles, that can help you answer different questions about the current and future performance of your team.

Docker diff

When investigating a problem with a Dockerfile I wanted to check what was changed when running the Docker image. I first had to look up the container id through docker ps:

C:\Users\bawu>docker ps
CONTAINER ID   IMAGE
2ca085df3487   masstransit/rabbitmq:latest

Now I could run docker diff <CONTAINER> using the container id to see the files that changed:

C:\Users\bawu>docker diff 2ca085df3487
C /var
C /var/log
C /var/log/rabbitmq
A /var/log/rabbitmq/log
A /var/log/rabbitmq/log/crash.log
C /etc
C /etc/rabbitmq
A /etc/rabbitmq/rabbitmq.conf

Build your ASP.NET Core application outside the Docker container

Most of the examples you find for running an ASP.NET Core application inside a Docker container use the multi-stage build approach. In this approach you create a Dockerfile where building the application happens inside the Dockerfile itself; the output of this build is then used in a second stage to create the final Docker image: This is fine and results in small, optimized Docker images. So why this blog post? The problem becomes clear when you take a look at our build pipeline: What you can see is that we build the application twice; once to run the unit tests, code analysis, vulnerability scanning etc., and once to produce the Docker image. Although we use different stages in Azure Pipelines, it is still a waste of resources. An alternative approach is to build your ASP.NET Core application outside the Docker container. The Dockerfile is then only used to copy the build artifacts from the publish folder into the Docker image. More information:
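With the publish output produced by the pipeline, the Dockerfile shrinks to a single stage. A sketch, assuming the artifacts were published to a local ‘publish’ folder and the assembly is called MyApp (both placeholders):

```dockerfile
# No build stage: the app was already built and published by the pipeline
FROM mcr.microsoft.com/dotnet/aspnet:5.0
WORKDIR /app
COPY ./publish .
ENTRYPOINT ["dotnet", "MyApp.dll"]
```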

C# 9 Switch Expressions with Type patterns

C# 9 allows you to combine the power of pattern matching with switch expressions. I had a use case where I had to check the type of an object and, depending on the type, execute different logic. Before C# 7, type checks were not possible in a switch, so although I wanted to write the following switch statement, it would not compile: In C# 7, type pattern support was added, so I could write the following: C# 9 further improves on this with the introduction of switch expressions. Now I can handle this use case like this: Neat!
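A sketch of the end result the post describes, using hypothetical shape types rather than the types from the original code:

```csharp
using System;

public abstract class Shape { }
public class Circle : Shape { public double Radius { get; set; } }
public class Square : Shape { public double Side { get; set; } }

public static class AreaCalculator
{
    // C# 7 type patterns already allowed this in a switch statement;
    // a C# 9 switch expression turns it into a single expression:
    public static double Area(Shape shape) => shape switch
    {
        Circle c => Math.PI * c.Radius * c.Radius,
        Square s => s.Side * s.Side,
        _ => throw new ArgumentException("Unknown shape", nameof(shape))
    };
}
```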

Azure DevOps Pipelines - The OutputPath property is not set for project

When trying to build a csproj file using Azure DevOps pipelines, it failed with the following error message: The OutputPath property is not set for project. The important thing to notice here is that this only happens when pointing the build task to a csproj file instead of a sln file. Turns out that there is a small difference between how the platform variable is configured at the solution level vs the project level. These are the variable settings we use inside the build task: $(BuildConfiguration) = "Release" and $(BuildPlatform) = "Any CPU". When I took a look at the csproj file, it was expecting "AnyCPU" as the platform setting, not "Any CPU" (notice the space). I fixed it by setting the platform to "AnyCPU" and not using the build variable for this specific task.
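In a YAML build task this amounts to hard-coding the platform for the csproj (the project path below is a placeholder):

```yaml
- task: VSBuild@1
  inputs:
    solution: 'src/MyProject/MyProject.csproj'  # hypothetical path
    platform: 'AnyCPU'                          # not $(BuildPlatform), which is 'Any CPU'
    configuration: '$(BuildConfiguration)'
```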