

Application Insights–OpenTelemetry integration for ASP.NET Core

OpenTelemetry is becoming *the* standard for telemetry instrumentation, and thanks to the Azure Monitor OpenTelemetry Exporter we can keep using Application Insights as our tool of choice. In this post I'll walk you through the steps to link your OpenTelemetry-enabled ASP.NET Core application to Azure Monitor Application Insights. Important to mention is that this feature is still in preview! At the moment of writing, distributed tracing and metrics are supported, but some of the other features that we like from the Application Insights SDK are not available (yet). What is not supported? I copied the list from the documentation:

- Live Metrics
- Logging API (like console logs and logging libraries)
- Profiler
- Snapshot Debugger
- Azure Active Directory authentication
- Autopopulation of Cloud Role Name and Cloud Role Instance in Azure environments
- Autopopulation of User ID and Authenticated User ID when you use the Application Insights JavaScript SDK
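As a rough sketch of what the wiring looks like, assuming the preview Azure.Monitor.OpenTelemetry.Exporter and OpenTelemetry.Extensions.Hosting packages are installed (since the feature is in preview, the exact API surface may have changed):

```csharp
// Program.cs - minimal sketch, not a definitive implementation.
// The connection string below is a placeholder.
using Azure.Monitor.OpenTelemetry.Exporter;
using OpenTelemetry.Trace;

var builder = WebApplication.CreateBuilder(args);

// Register OpenTelemetry tracing and export spans to Application Insights
builder.Services.AddOpenTelemetryTracing(tracing => tracing
    .AddAspNetCoreInstrumentation()          // incoming HTTP requests
    .AddHttpClientInstrumentation()          // outgoing HTTP calls
    .AddAzureMonitorTraceExporter(options =>
    {
        // Taken from the Application Insights resource in the Azure portal
        options.ConnectionString = "InstrumentationKey=<your-key>";
    }));

var app = builder.Build();
app.MapGet("/", () => "Hello OpenTelemetry");
app.Run();
```

With this in place, request traces show up in Application Insights without referencing the classic Application Insights SDK.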
Recent posts

Switch between node versions on Windows

By default, you have one Node.js and npm version installed on your machine. However, if you have to maintain applications written in multiple Angular and/or TypeScript versions, this can become a problem: older Angular/TypeScript applications are not compatible with the latest Node.js version and can no longer be compiled. On Linux, this has been solved by the introduction of nvm, the Node Version Manager. With it, it is possible to quickly install (and uninstall) different Node.js versions and easily switch between them via the command line. NVM-Windows With nvm-windows, an alternative is also available for Windows. You can easily install nvm-windows on your local machine via the installer available here. Using nvm-windows is very simple:

- Install a Node.js version: nvm install <version>
- List the installed versions: nvm list
- Switch between versions: nvm use <version>

No
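The three commands above in a typical session (the version numbers are illustrative, not a recommendation):

```shell
# Install an older Node.js next to the one already on the machine
nvm install 14.21.3

# Show all installed versions; the active one is marked
nvm list

# Switch the active version, then verify
nvm use 14.21.3
node --version
```

Switching takes effect for new shells as well, so a legacy Angular build and a current one can coexist on the same machine.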

Central Package Management - warning NU1507: There are 2 package sources defined in your configuration.

A few weeks ago, I talked about a new NuGet feature, Central Package Management. This allows you to manage your dependencies at the solution level instead of at the project level. After converting one of my projects to use Central Package Management, I noticed the following warning in Visual Studio: warning NU1507: There are 2 package sources defined in your configuration. When using central package management, please map your package sources with package source mapping or specify a single package source. The following sources are defined: I get this warning because I have two package sources defined, one at the machine level and one at the solution level in my nuget.config. I could get rid of this warning by using a single package source. Therefore I need to update my nuget.config file to have only one package source defined and add a <clear /> statement to not use package sources found at other level
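A minimal solution-level nuget.config along those lines (nuget.org is the public default feed; substitute your own source if needed):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- Ignore package sources inherited from machine/user level configs -->
    <clear />
    <!-- A single remaining source, so NU1507 no longer fires -->
    <add key="nuget.org" value="https://api.nuget.org/v3/index.json" />
  </packageSources>
</configuration>
```

The <clear /> element is what prevents the machine-level source from being merged in on top of the solution-level one.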

There is no such thing as a requirement

In software development we are used to the term 'requirement'. I don't like this term and I will explain why. Using the word requirement puts a strong emphasis on 'required': 'this is something that is required' or 'something the business MUST have to succeed'. But by describing it as required, there is no room for discussion. Is this really what the business needs? Does it make sense? Or is there maybe a better solution? What if we can achieve the same outcome in a different way? If it really was required, the business would not be able to exist or even function without it. So by definition what the business asks for cannot be a requirement, a necessity. There is no such thing as a requirement When I hear the word requirement, I always have to think about the following quote: "If I had asked people what they wanted, they would have said faster horses." Although there is no evidence that Henry Ford ever said those words, I believe he was certainly thi

Cleanup old MetricBeats data

At one of my clients we are using MetricBeats to monitor our RabbitMQ cluster. Of course this can result in a lot of data over time. Recently I was called by one of the system administrators asking why the disk was filling up on the servers hosting Elasticsearch. Let's find out together… Index lifecycle policies To keep the amount of data on your Elasticsearch cluster under control, you can configure a lifecycle policy. A lifecycle policy moves data through multiple phases:

- A hot phase: used for your most recent, most frequently searched data. It provides the highest indexing and search performance but comes with the highest cost.
- A warm phase: optimized for search performance over indexing performance. In this phase it is expected that the data doesn't change that often anymore.
- A cold phase: optimized for cost saving over search performance. In this phase the data is read-only.
- A delete phase: deletes the data you no longer need.

Let's see how to configure a lifecy
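As a sketch, a lifecycle policy with a hot and a delete phase can be created through the Elasticsearch ILM API like this (the policy name and the age/size thresholds are illustrative, not values from the original post):

```json
PUT _ilm/policy/metricbeat-cleanup
{
  "policy": {
    "phases": {
      "hot": {
        "actions": {
          "rollover": { "max_size": "50gb", "max_age": "30d" }
        }
      },
      "delete": {
        "min_age": "90d",
        "actions": { "delete": {} }
      }
    }
  }
}
```

Once attached to the Metricbeat index template, indices roll over in the hot phase and are deleted 90 days after rollover, which keeps the disk usage bounded.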

GraphQL–Discriminated unions for input types through OneOf

Today I want to show you how you can use discriminated unions for input types in GraphQL through the OneOf specification. Disclaimer: the OneOf specification is still in draft, so it is possible that it will still change or never become a part of the GraphQL specification. Let me first start by explaining the use case where I want to apply this feature. In our GraphQL API, we have multiple root queries that can be queried through either the technical id (an integer) or the business key (a string). Our original API looked something like this: A consumer of this API should use either the id or the business key, but I could not prevent the user from using both: To handle this situation I throw an error when the user tries to invoke the GraphQL query providing both arguments: This is a perfect case that can be improved through the usage of OneOf. By using OneOf on an input object we tell GraphQL that only one field of the input object can be set. In a schema first approac
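In SDL, the draft OneOf RFC expresses this with a @oneOf directive on an input type. A sketch for the id-or-business-key case described above (the type and field names are made up for illustration):

```graphql
# Exactly one of these fields must be provided (draft OneOf RFC)
input ProductLookupInput @oneOf {
  id: Int
  businessKey: String
}

type Product {
  id: Int!
  businessKey: String!
  name: String!
}

type Query {
  product(input: ProductLookupInput!): Product
}
```

With this, a request supplying both id and businessKey is rejected by validation instead of by a hand-written runtime check.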

Property based testing in C#–Part 4

In this multipart blog post I want to introduce you to the world of property-based testing and how to do this in C#. In the first part, I gave you an introduction on what property-based testing is, why it is useful and how it can help you write better code. In the second post, I showed you a concrete example of how to start writing property-based tests in C# using FsCheck. In a third post I explained how property-based tests can help to find edge cases and to understand a codebase better. In this post I continue the journey by having a look at how to write our own generators. If you didn't read my previous posts: generators are the tool that helps you to select a value from an interval. For some of the available types in .NET, a default generator (and shrinker) exists, but sometimes it is necessary to create your own generators. Create your own FsCheck generator in C# Creating your own generator for FsCheck is easy in C#; the only thing you need is a public static class with a
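A minimal sketch of what such a class can look like, assuming the FsCheck 2.x API with FsCheck.Xunit (the Email wrapper type and the address values are made up for illustration):

```csharp
using FsCheck;
using FsCheck.Xunit;

// Hypothetical domain type we want FsCheck to generate values for
public record Email(string Value);

public static class EmailGenerator
{
    // FsCheck looks for public static members returning Arbitrary<T>
    public static Arbitrary<Email> Emails() =>
        Arb.From(
            from user in Gen.Elements("alice", "bob", "carol")
            from domain in Gen.Elements("example.com", "example.org")
            select new Email($"{user}@{domain}"));
}

public class EmailProperties
{
    // Point the property at the class holding our custom generator
    [Property(Arbitrary = new[] { typeof(EmailGenerator) })]
    public bool Email_always_contains_an_at_sign(Email email) =>
        email.Value.Contains("@");
}
```

The LINQ query over Gen composes smaller generators into a larger one; registering the class via the Property attribute makes FsCheck use it whenever a test parameter of type Email appears.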