Tuesday, November 24, 2020

Snapper - A snapshot does not exist

The first time I ran Snapper it failed with the following error message:

    Snapper.Exceptions.SnapshotDoesNotExistException : A snapshot does not exist.

    Apply the [UpdateSnapshots] attribute on the test method or class and then run the test again to create a snapshot.

  Stack Trace:

    SnapshotAsserter.AssertSnapshot(SnapResult snapResult)

    Snapper.MatchSnapshot(Object snapshot, SnapshotId snapshotId)

    Snapper.MatchSnapshot(Object snapshot, String childSnapshotName)

    Snapper.MatchSnapshot(Object snapshot)

    SnapperExtensions.ShouldMatchSnapshot(Object snapshot)

    GraphQLTests.Queries_Succeed(String queryName) line 57

    --- End of stack trace from previous location where exception was thrown ---

This error makes sense, as Snapper expects a snapshot to exist when you run your tests. To fix this, update your tests and add the [UpdateSnapshots] attribute above your test method or class.

When Snapper finds this attribute it will not search for an existing snapshot but create a new one (or replace an existing one). After you have run your tests for the first time you can remove the attribute.
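As a sketch, a test with the attribute applied could look like this (test and type names are made up for illustration):

```csharp
using Snapper;
using Snapper.Attributes;
using Xunit;

public class GraphQLTests
{
    [Fact]
    [UpdateSnapshots] // creates (or replaces) the snapshot; remove after the first run
    public void Query_Succeeds()
    {
        var result = new { Name = "Product 1", Price = 10 };

        result.ShouldMatchSnapshot();
    }
}
```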

Don’t forget to check the generated files in the ‘_snapshots’ folder into your source control system:

Monday, November 23, 2020

Using XUnit Theory with Snapper

To test my GraphQL schema I first used SnapShooter.

From the documentation:

Snapshooter is a flexible .Net testing tool, which can be used to validate all your test results with one single assert. It creates simply a snapshot of your test result object and stores it on the file system. If the test is executed again, the test result will be compared with the stored snapshot. If the snapshot matches to the test result, the test will pass.

It is really a good fit to test your GraphQL API. Unfortunately I got into trouble when I tried to combine it with XUnit Theory. I had created one test that loaded multiple GraphQL files and validated them:
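A sketch of what that test looked like (names and the ExecuteQueryAsync helper are assumptions, not the exact original code):

```csharp
using System.IO;
using System.Threading.Tasks;
using Snapshooter.Xunit;
using Xunit;

public class GraphQLTests
{
    [Theory]
    [InlineData("getProducts")]
    [InlineData("getOrders")]
    public async Task Queries_Succeed(string queryName)
    {
        // load the .graphql file and execute it against the schema
        var query = await File.ReadAllTextAsync($"Queries/{queryName}.graphql");
        var result = await ExecuteQueryAsync(query);

        // SnapShooter derives the snapshot name from the test method,
        // so every [InlineData] run compares against the same snapshot
        Snapshot.Match(result);
    }
}
```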

This didn’t work as expected because the same snapshot is used for every test run. As a consequence, the test fails when it runs again with a different parameter value.

There is probably a way to fix it, but I was lazy, so I took a look online and found Snapper, which offers similar features to SnapShooter but supports XUnit Theory through a feature called child snapshots. Let’s update our example:
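Reworked with Snapper, using the parameter value as the child snapshot name (a sketch; ExecuteQueryAsync is an assumed helper):

```csharp
using System.IO;
using System.Threading.Tasks;
using Snapper;
using Xunit;

public class GraphQLTests
{
    [Theory]
    [InlineData("getProducts")]
    [InlineData("getOrders")]
    public async Task Queries_Succeed(string queryName)
    {
        var query = await File.ReadAllTextAsync($"Queries/{queryName}.graphql");
        var result = await ExecuteQueryAsync(query);

        // each parameter value gets its own child snapshot inside one snapshot file
        result.ShouldMatchChildSnapshot(queryName);
    }
}
```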

Friday, November 20, 2020

Enable .NET 5 in your Azure App Service

After deploying a .NET 5 application in an Azure App Service, I got the following error message when I tried to run the application:

HTTP Error 500.31 - ANCM Failed to Find Native Dependencies
Common solutions to this issue:
The specified version of Microsoft.NetCore.App or Microsoft.AspNetCore.App was not found.

This is because in Azure App Service .NET 5 is not enabled by default (yet).

Let’s see how to fix this:

  • Open the Azure portal
  • Go to the App Service you want to configure
  • Click on Configuration in the Settings section

  • Go to the Stack Settings and change the .NET Framework version to .NET 5 (Early Access)

  • Click Save
  • Restart the App Service

Thursday, November 19, 2020

.NET 5–Source Generators–Lessons Learned–Part 3

One of the new features in .NET 5 that triggered my curiosity was source generators. I did some investigation on how to use them and want to share the lessons I learned along the way.

Yesterday I finally got my first source generator up and running. Now it is time to move on and do something more interesting. I found an example on GitHub that created a strongly typed config class based on your appsettings.json. I tried to duplicate the code, but when I built my application the source generator didn’t run.

In the build output I noticed the following warning:







    Unable to load Analyzer assembly c:\lib\netstandard2.0\System.Text.Encodings.Web.dll: Could not find a part of the path 'c:\lib\netstandard2.0\System.Text.Encodings.Web.dll'.

    System.IO.FileNotFoundException: Could not find a part of the path 'c:\lib\netstandard2.0\System.Text.Encodings.Web.dll'.
    File name: 'c:\lib\netstandard2.0\System.Text.Encodings.Web.dll'
    ---> System.IO.DirectoryNotFoundException: Could not find a part of the path 'c:\lib\netstandard2.0\System.Text.Encodings.Web.dll'.
       at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
       at System.IO.FileStream.Init(String path, FileMode mode, FileAccess access, Int32 rights, Boolean useRights, FileShare share, Int32 bufferSize, FileOptions options, SECURITY_ATTRIBUTES secAttrs, String msgPath, Boolean bFromProxy, Boolean useLongPath, Boolean checkHost)
       at System.IO.FileStream..ctor(String path, FileMode mode, FileAccess access, FileShare share)
       at Roslyn.Utilities.FileUtilities.OpenFileStream(String path)
       at Microsoft.CodeAnalysis.Diagnostics.AnalyzerFileReference.GetAnalyzerTypeNameMap(String fullPath, AttributePredicate attributePredicate, AttributeLanguagesFunc languagesFunc)
       at Microsoft.CodeAnalysis.Diagnostics.AnalyzerFileReference.Extensions`1.GetExtensionTypeNameMap()
       at Microsoft.CodeAnalysis.Diagnostics.AnalyzerFileReference.Extensions`1.AddExtensions(Builder builder)

The source generator is using System.Text.Json and System.Text.Encodings.Web to parse the config files. But it seems that the source generator cannot find and use these references.

How to use external references in source generators?

There is some extra work that needs to be done before external libraries can be loaded successfully.

For every reference that your source generator needs you need to do 2 things:

  1. Take a private dependency on the package (PrivateAssets="all"). By doing this, consumers of the generator will not reference it.
  2. Set the dependency to generate a path property by adding GeneratePathProperty="true". This creates a new MSBuild property of the format PKG<PackageName>, where <PackageName> is the package name with . replaced by _.

Now we can use the generated path property to add the binaries to the resulting NuGet package, as we do with the generator itself:
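Applied to this case, the generator project file could look like this (version numbers are illustrative; the pattern comes from the source generators cookbook):

```xml
<ItemGroup>
  <!-- private dependency + path property for every runtime reference of the generator -->
  <PackageReference Include="System.Text.Json" Version="4.7.2" PrivateAssets="all" GeneratePathProperty="true" />
  <PackageReference Include="System.Text.Encodings.Web" Version="4.7.1" PrivateAssets="all" GeneratePathProperty="true" />
</ItemGroup>

<PropertyGroup>
  <GetTargetPathDependsOn>$(GetTargetPathDependsOn);GetDependencyTargetPaths</GetTargetPathDependsOn>
</PropertyGroup>

<Target Name="GetDependencyTargetPaths">
  <ItemGroup>
    <!-- PKG<PackageName>: '.' in the package name is replaced by '_' -->
    <TargetPathWithTargetPlatformMoniker Include="$(PKGSystem_Text_Json)\lib\netstandard2.0\System.Text.Json.dll" IncludeInPackage="true" />
    <TargetPathWithTargetPlatformMoniker Include="$(PKGSystem_Text_Encodings_Web)\lib\netstandard2.0\System.Text.Encodings.Web.dll" IncludeInPackage="true" />
  </ItemGroup>
</Target>
```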

More information in the source generators cookbook:  https://github.com/dotnet/roslyn/blob/master/docs/features/source-generators.cookbook.md#use-functionality-from-nuget-packages

Wednesday, November 18, 2020

.NET 5–Source generators–Lessons learned–Part 2

One of the new features in .NET 5 that triggered my curiosity was source generators. I did some investigation on how to use them and want to share the lessons I learned along the way.

Yesterday I wrote my first generator. Unfortunately nothing happened and no new code was generated. I tried to attach the debugger, but even after a rebuild nothing happened. It seemed that the code itself was never called.

It took me some time to figure out why it didn’t work; it turns out that you have to put your source generator in a separate assembly (which makes sense). In my first try I had added the source generator logic to the same project as the rest of my code, but that doesn’t work.

How to package a source generator?

To get source generators up and running, you need to create a separate project and add the source generator logic there.

So create a new .NET Standard project and add a reference to the following package:
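At the time of writing that means referencing the Roslyn packages (the versions below are the ones current for .NET 5; newer ones may exist):

```xml
<ItemGroup>
  <PackageReference Include="Microsoft.CodeAnalysis.CSharp" Version="3.8.0" PrivateAssets="all" />
  <PackageReference Include="Microsoft.CodeAnalysis.Analyzers" Version="3.3.1" PrivateAssets="all" />
</ItemGroup>
```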

Generators can be packaged using the same method as an analyzer. Therefore the generator should be placed in the analyzers\dotnet\cs folder of the package for it to be automatically added to the user’s project on install. To enable this, add the following to your project file:
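The source generators cookbook suggests packing the generator assembly into the analyzers folder like this:

```xml
<ItemGroup>
  <!-- pack the generator dll in the analyzers/dotnet/cs path of the NuGet package -->
  <None Include="$(OutputPath)\$(AssemblyName).dll" Pack="true" PackagePath="analyzers/dotnet/cs" Visible="false" />
</ItemGroup>
```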

Now we can add our source generators here.

There is one last step you also need to take into account. If you just create a reference to your SourceGenerator project it will not work. You have to change the project reference and include `OutputItemType="Analyzer" ReferenceOutputAssembly="false"`:
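With a hypothetical SourceGenerator project, the reference in the consuming project becomes:

```xml
<ItemGroup>
  <ProjectReference Include="..\SourceGenerator\SourceGenerator.csproj"
                    OutputItemType="Analyzer"
                    ReferenceOutputAssembly="false" />
</ItemGroup>
```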

Now the source generator is finally invoked when I build my code.

Tuesday, November 17, 2020

.NET 5–Source generators–Lessons learned - Part 1

One of the new features in .NET 5 that triggered my curiosity was source generators. I did some investigation on how to use them and want to share the lessons I learned along the way.

But let’s first start with an explanation of what “a source generator” actually is:

A Source Generator is a new kind of component that C# developers can write that lets you do two major things:

  1. Retrieve a Compilation object that represents all user code that is being compiled. This object can be inspected and you can write code that works with the syntax and semantic models for the code being compiled, just like with analyzers today.
  2. Generate C# source files that can be added to a Compilation object during the course of compilation. In other words, you can provide additional source code as input to a compilation while the code is being compiled.

So in short, source generators are a new compiler feature that allows you to inspect existing code and generate new code (remark: you cannot change existing code).

I started with the example mentioned in the blog post above:
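For reference, the example from that blog post boils down to something like this (using the released .NET 5 API):

```csharp
using System.Text;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.Text;

[Generator]
public class HelloWorldGenerator : ISourceGenerator
{
    public void Initialize(GeneratorInitializationContext context)
    {
        // no initialization required for this simple generator
    }

    public void Execute(GeneratorExecutionContext context)
    {
        // add a new source file to the compilation
        var source = @"
namespace HelloWorldGenerated
{
    public static class HelloWorld
    {
        public static void SayHello() => System.Console.WriteLine(""Hello from generated code!"");
    }
}";
        context.AddSource("helloWorldGenerated", SourceText.From(source, Encoding.UTF8));
    }
}
```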

But when I built my project, nothing happened and it looked like no code was generated. What was I doing wrong?

How to debug a source generator?

So the first question became: how can I debug a source generator? I had no clue how to investigate what was (not) happening.

The trick is to add a `Debugger.Launch()` statement inside your code:
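For example, at the start of the generator’s Execute method:

```csharp
public void Execute(GeneratorExecutionContext context)
{
#if DEBUG
    // opens the 'choose a debugger' dialog during the build
    if (!System.Diagnostics.Debugger.IsAttached)
    {
        System.Diagnostics.Debugger.Launch();
    }
#endif

    // ...the actual generator logic...
}
```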

Now when you build your code, a popup should appear that asks you which debugger you want to launch.

Monday, November 16, 2020

GraphQL Hot Chocolate–Enable global authentication

There are multiple ways to enable authentication in Hot Chocolate. Here is a simple approach:

Step 1 – Enable ASP.NET Core authentication

The first step is to enable authentication at the ASP.NET Core level. Let’s use JWT tokens for authentication:
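A minimal sketch of the Startup configuration, assuming JWT bearer tokens (the authority and audience values are placeholders):

```csharp
public void ConfigureServices(IServiceCollection services)
{
    services
        .AddAuthentication(JwtBearerDefaults.AuthenticationScheme)
        .AddJwtBearer(options =>
        {
            options.Authority = "https://your-identity-provider";
            options.Audience = "graphql-api";
        });
}

public void Configure(IApplicationBuilder app)
{
    app.UseRouting();
    app.UseAuthentication(); // must come before the GraphQL middleware
    app.UseGraphQL();
}
```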

Step 2 - Enable authentication at the root GraphQL query

The second (and already the last) step is to enable authentication on the root query type. By providing no role or policy names we’re simply saying the user must be authenticated.
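In Hot Chocolate this can be expressed on the query type descriptor (a sketch; the exact API differs per Hot Chocolate version and requires the authorization package to be registered):

```csharp
public class QueryType : ObjectType<Query>
{
    protected override void Configure(IObjectTypeDescriptor<Query> descriptor)
    {
        // no roles or policies: any authenticated user is allowed
        descriptor.Authorize();
    }
}
```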

Friday, November 13, 2020

Visual Studio 2019 16.8– Git Amend

With the release of Visual Studio 2019 16.8 Git became the default source control experience. It no longer lives inside Team Explorer but got its own menu item.

Read the announcement for more details.

One of the things I noticed when using the new Git experience was the first-class support for ‘git amend’. But what does ‘git amend’ do?

Let me explain…

The git commit --amend command is a convenient way to modify the most recent commit. It lets you combine staged changes with the previous commit instead of creating an entirely new commit. Amending does not just alter the most recent commit, it replaces it entirely, meaning the amended commit will be a new entity with its own ref. It is a way to rewrite the git history.

Warning: Avoid amending a commit that other developers have based their work on. To be safe only amend local commits.
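A quick demonstration in a throwaway repository — note how the commit is replaced rather than appended:

```shell
# set up a scratch repository so the example is self-contained
dir=$(mktemp -d)
cd "$dir"
git init -q
git config user.email "demo@example.com"
git config user.name "Demo"

git commit -q --allow-empty -m "initial commit"
before=$(git rev-parse HEAD)

# amend: combines anything staged with the previous commit and replaces it
git commit -q --amend --allow-empty -m "initial commit, reworded"
after=$(git rev-parse HEAD)

git rev-list --count HEAD   # still 1 commit
```

The history still contains a single commit, but its SHA has changed — which is exactly why you should not amend commits you have already shared.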

Thursday, November 12, 2020

Predefined type 'System.Runtime.CompilerServices.IsExternalInit' is not defined or imported

With the release of .NET 5 it was finally time to try record types. Here is my first attempt that I copied from the documentation:
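The snippet in question was close to the record example from the documentation, using the new init accessors:

```csharp
public record Person
{
    public string FirstName { get; init; }
    public string LastName { get; init; }
}
```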

Unfortunately this turned out not to be the success I was hoping for. Visual Studio returned the following error message:







Predefined type 'System.Runtime.CompilerServices.IsExternalInit' is not defined or imported

The error is not very descriptive, but the reason I got it is that I forgot to update the project to .NET 5. The fix was easy; after updating the target framework the error disappeared:
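In the csproj:

```xml
<PropertyGroup>
  <TargetFramework>net5.0</TargetFramework>
</PropertyGroup>
```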

CS8632 - The annotation for nullable reference types should only be used in code within a ‘#nullable’ annotations context.

I copy/pasted the following code from an example I found on the web to test source generators (but that is not what this post is about today).
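The relevant part of the copied code looked similar to this (reduced to the property that matters here):

```csharp
public class Person
{
    public string? Name { get; set; }   // the '?' marks the string as nullable
}
```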

Building this code resulted in the following warning:

Do you have any idea why?

Take a look at the ‘Name’ property and notice the usage of ‘?’. This is part of the introduction of nullable reference types and declares the string as nullable.

Nullable reference types are available beginning with C# 8.0, in code that has opted in to a nullable aware context. This is what the warning message is referring to. We need to enable nullable reference type support in our project.

This can be done by adding <Nullable>enable</Nullable> to our csproj file:
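The resulting property group (target framework shown for context):

```xml
<PropertyGroup>
  <TargetFramework>net5.0</TargetFramework>
  <Nullable>enable</Nullable>
</PropertyGroup>
```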

Wednesday, November 11, 2020

The target framework 'netcoreapp3.0' is out of support and will not receive security updates in the future.

After upgrading to the latest Visual Studio version (16.8.1), which added support for .NET 5, I got a new warning when opening an existing .NET Core 3.0 solution:







The target framework 'netcoreapp3.0' is out of support and will not receive security updates in the future. Please refer to https://aka.ms/dotnet-core-support for more information about the support policy.

C:\Program Files\dotnet\sdk\5.0.100\Sdks\Microsoft.NET.Sdk\targets\Microsoft.NET.EolTargetFrameworks.targets


You don’t have to panic when you see this warning. The only thing you need to be aware of is that some versions of .NET have a short support lifecycle while other versions are LTS (Long Term Support) versions. .NET Core 3.0 is a release with a short support lifecycle (support ended March 3, 2020) whereas .NET Core 3.1 is an LTS version.

To get rid of this warning upgrade your app to .NET Core 3.1.

Remark: Be aware that .NET 5 is also a version with a short support lifecycle. Support will end shortly after the release of .NET 6 (planned for February 2022).

Tuesday, November 10, 2020

ORM- Attaching detached object trees

One of the nice features of an ORM is that it allows you to change a complete object tree and the ORM will figure out what needs to be added, updated, and deleted. The reason this works is that a typical ORM will, by default, track all objects that are attached to the ORM abstraction (DbContext in EF Core and ISession in NHibernate). Because it tracks these objects (and their relations), it knows what should happen when you make a change.

Unfortunately, when you send objects to a client through an API (typically using DTOs and tools like AutoMapper) and you get a changed object tree back, the objects in this tree are no longer attached to the ORM, so it doesn’t know what should be inserted, updated, or deleted. Luckily most ORMs are smart and use different tricks to figure out what has changed.

However, one place where these ORMs typically fail is when you are using AutoMapper and mapping a child collection in your DTOs to your domain model. Here is an example using Fluent NHibernate:

Imagine that we have an existing Order entity with an OrderDetail collection containing the following data:

{ Id: 22, OrderId: 49 }

And an OrderDetail DTO arrives with the following data:

{ Id: 22, OrderId: 49 }

{ Id: 23, OrderId: 49 }

This should result in an INSERT operation, as one new OrderDetail is added. Instead it fails with the following error message:

NHibernate.NonUniqueObjectException: a different object with the same identifier value was already associated with the session: 22, of entity: OrderDetail

The reason this fails is that AutoMapper by default creates a new collection when mapping from one type to another. This confuses NHibernate and makes it think that both OrderDetails are new and should be added. (A similar problem exists when using Entity Framework.)

A possible fix is to use Automapper.Collection. This will Add/Update/Delete items from a preexisting collection object instead of creating a new collection.

To enable this add the Nuget package to your project and update the Automapper configuration:
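A sketch of the configuration, using AutoMapper.Collection’s AddCollectionMappers and an equality match on Id (type names follow the example above):

```csharp
using AutoMapper;
using AutoMapper.EquivalencyExpression;

var config = new MapperConfiguration(cfg =>
{
    // let AutoMapper patch existing collections instead of replacing them
    cfg.AddCollectionMappers();

    cfg.CreateMap<OrderDetailDto, OrderDetail>()
       // match DTO items to existing entities by Id, so existing items are updated in place
       .EqualityComparison((dto, detail) => dto.Id == detail.Id);
});
```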

Monday, November 9, 2020

Microsoft Orleans- Warning NoOpHostEnvironmentStatistics

When starting up my Orleans cluster I noticed the following warning message:

warn: Orleans.Runtime.NoOpHostEnvironmentStatistics[100708]

      No implementation of IHostEnvironmentStatistics was found. Load shedding will not work yet

To get rid of this warning you need to include the following line in the SiloBuilder config:
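On Windows the host environment statistics can be wired up like this (there is a UseLinuxEnvironmentStatistics equivalent in a separate package for Linux):

```csharp
var host = new SiloHostBuilder()
    // registers an IHostEnvironmentStatistics implementation based on performance counters,
    // which enables load shedding (package: Microsoft.Orleans.OrleansTelemetryConsumers.Counters)
    .UsePerfCounterEnvironmentStatistics()
    .Build();
```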

This enables support for load shedding, which kicks in when the system gets overloaded and prevents new clients from connecting.

Friday, November 6, 2020

Kafka - Avro - Value serialization error

The default serialization format for Kafka is Avro. I mentioned how to use this in C# yesterday.

Today I got into trouble when trying to send a specific message.

I changed the example a little bit to explicitly point out the issue. In the code above I set the value for ‘Browser’ to null.

When trying to send this message it failed with the following error:

Local: Value serialization error

Let’s have a look at the related avro schema:

The problem is that the schema specifies that the Browser field should have a value of type string; null is not a valid value for a string. This explains why it fails.

To solve this I have two options:

1) Change the code to send an empty string instead of null:

2) Or update the schema to allow null values for the Browser field:
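The second option means declaring Browser as a union of null and string in the .avsc file:

```json
{
  "name": "Browser",
  "type": ["null", "string"],
  "default": null
}
```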

More about Avro: https://en.wikipedia.org/wiki/Apache_Avro

Thursday, November 5, 2020

Kafka- Using Avro as serialization format in C#

To help you use Avro as the serialization format for your Kafka messages, a .NET Core global tool, avrogen, is available.

  • First install the tool using dotnet tool install:
  • The next step is to specify your message schema. To do this, create an .avsc file and add your message specification:
  • Now it’s time to generate the necessary code:
  • This will generate the following:
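The commands involved look roughly like this (tool and schema names as I used them; yours may differ):

```shell
# install the code generator as a global tool
dotnet tool install --global Confluent.Apache.Avro.AvroGen

# generate C# classes from the schema into the current directory
avrogen -s PageViewEvent.avsc .
```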

The generated type can then be used by your Producer and Consumer logic.

More information: https://www.confluent.io/blog/avro-kafka-data/

    Wednesday, November 4, 2020

    DDD–Strongly typed Ids-Using C#9 Record Types

    I blogged before about the usage of Strongly typed Ids in your domain models. With the introduction of C#9 record types, an alternative approach becomes an option.

    Record types are reference types with built-in immutability and value semantics. They automatically provide implementations for Equals, GetHashCode, etc, and offer a very concise syntax known as positional records. This allows us to rewrite our ProductId type using records:

    public record ProductId(int Value);
    That’s all that is needed.

    Thank you C# 9!

    Tuesday, November 3, 2020

    Sending a message through Kafka - Value serializer not specified and there is no default serializer defined for type

    My first attempt to send a typed message through Kafka resulted in the following error message:

    Value cannot be null. (Parameter 'Value serializer not specified and there is no default serializer defined for type PageViewEvent)

    Here is the code I was using:
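Roughly, it was a producer built without a value serializer (reconstructed; not the exact original code — PageViewEvent is the generated Avro type from this series):

```csharp
using Confluent.Kafka;

var producerConfig = new ProducerConfig { BootstrapServers = "localhost:9092" };

// no value serializer registered for PageViewEvent -> the builder throws
using var producer = new ProducerBuilder<string, PageViewEvent>(producerConfig).Build();

await producer.ProduceAsync("page-views",
    new Message<string, PageViewEvent> { Key = "user-1", Value = pageViewEvent });
```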

    As the error message mentions, you need to explicitly specify what serializer should be used for your message object. Therefore you need to use the SchemaRegistryClient and specify a serializer(I’m using Avro in the sample below):
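A sketch of the fix, using the Confluent schema registry client and the Avro serializer (URLs, topic, and variable names are placeholders):

```csharp
using Confluent.Kafka;
using Confluent.SchemaRegistry;
using Confluent.SchemaRegistry.Serdes;

var schemaRegistryConfig = new SchemaRegistryConfig { Url = "http://localhost:8081" };
var producerConfig = new ProducerConfig { BootstrapServers = "localhost:9092" };

using var schemaRegistry = new CachedSchemaRegistryClient(schemaRegistryConfig);
using var producer = new ProducerBuilder<string, PageViewEvent>(producerConfig)
    // explicitly tell the producer how to serialize the message value
    .SetValueSerializer(new AvroSerializer<PageViewEvent>(schemaRegistry))
    .Build();

await producer.ProduceAsync("page-views",
    new Message<string, PageViewEvent> { Key = "user-1", Value = pageViewEvent });
```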

    Monday, November 2, 2020

    Configure a Kafka topic in C#

    By default when you use the Confluent Kafka .NET client, a topic is created automatically for you when you publish your first message.

    However this will create a topic using the default settings. Typically you want to have more control when creating a topic. This is possible through the usage of the AdminClient:
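A sketch of topic creation through the AdminClient (names and settings are illustrative):

```csharp
using Confluent.Kafka;
using Confluent.Kafka.Admin;

var config = new AdminClientConfig { BootstrapServers = "localhost:9092" };

using var adminClient = new AdminClientBuilder(config).Build();

await adminClient.CreateTopicsAsync(new[]
{
    new TopicSpecification
    {
        Name = "page-views",
        NumPartitions = 3,
        ReplicationFactor = 1
    }
});
```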

    Friday, October 30, 2020

    Azure DevOps Docker build error–No such file or directory

    When trying to build a new .NET Core project using Azure DevOps, the build failed with the following error message:

    Step 6/16 : COPY ["DockerWebApplication1/DockerWebApplication1.csproj", "DockerWebApplication1/"]
    COPY failed: stat /var/lib/docker/tmp/docker-builder590338138/DockerWebApplication1/DockerWebApplication1.csproj: no such file or directory
    ##[error]COPY failed: stat /var/lib/docker/tmp/docker-builder590338138/DockerWebApplication1/DockerWebApplication1.csproj: no such file or director

    The problem is that the COPY command is using the wrong folder to copy from. To fix it, update your Docker Build task:

    • Uncheck the “Use Default Build Context” checkbox
    • Set the “Build Context” field to "$(System.DefaultWorkingDirectory)"

    Thursday, October 29, 2020

    Software development is learning

    In every software project you have two sides: the problem space and the solution space. The problem space is typically owned by a product owner; he or she knows the problem they want to solve. The solution space is owned by the developers; they know how to solve those problems.

    Easy, right? Unfortunately it isn’t as easy as it sounds. Product owners don’t always succeed in explaining the problem (or describe a workaround instead of the real problem). As a result, developers don’t always get it right the first time and build the wrong solution.

    The most important lesson you can take with you on any software project is that it is all about learning. As you are learning, nobody expects you to get it right the first time. Instead, somebody points out your mistakes and gives you feedback.

    So how can we become better in writing software? Not by typing faster and certainly not by making no mistakes.

    We become better by getting better feedback faster. The key is shorter feedback loops: the compiler that tells you you’ve made a mistake, a smoke test that immediately warns you that there is a problem with a build, a user that gives feedback when you demonstrate a new feature, …

    Tuesday, October 27, 2020

    HotChocolate - Apollo Tracing

    To investigate what’s going on inside your GraphQL backend, I would recommend enabling Apollo tracing. You could enable it permanently, but as it has a (small) performance impact, a better way is to enable it on demand.

    In HotChocolate you can specify this through the QueryExecutionOptions:
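In Hot Chocolate 10 the on-demand setting looks like this (a sketch; API names differ in later versions):

```csharp
services.AddGraphQL(
    SchemaBuilder.New().AddQueryType<QueryType>(),
    new QueryExecutionOptions
    {
        // only trace when the client asks for it via the GraphQL-Tracing header
        TracingPreference = TracingPreference.OnDemand
    });
```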

    After doing that, tracing will be activated when a GraphQL-Tracing HTTP header with a value of 1 is added to the request.

    Now your GraphQL result will not only contain the data but also tracing information. GraphQL tools like GraphQL Playground can visualize this data:

    If the tracing information is not shown, check your GraphQL Playground settings and see if "tracing.hideTracingResponse" is set to false.

    Monday, October 26, 2020

    SonarCloud–New Code Definition

    When scanning a project for the first time through SonarCloud in your Azure DevOps build pipeline, a new SonarCloud project is created automatically.

    The second time your build runs a comparison is done and you’ll have your first Quality Gate results. At least that is the theory…

    When we tried to do this for a new project, the corresponding SonarCloud project was created. However we got the following warning when browsing to this project in SonarCloud:

    This is a new requirement: every SonarCloud project must specify how ‘new code’ is identified. You can choose between the following options:

    • Previous version: All code that has changed since the previous version
    • Number of days: All code that has changed in the last x days
    • Specific date: All code that has changed since the specified date

    It can be set in the organization settings (Organization –> Administration –> New Code) and applies to all new projects.

    Unfortunately setting this didn’t resolve the issue for this already existing project. I couldn’t find a way to get rid of the error so in the end I solved it by throwing away and recreating the project.

    Friday, October 23, 2020

    Learning Docker in a month

    If you are new to Docker, I can recommend the Learn Docker in a Month of Lunches YouTube series.

    It will cost you your lunch break but that is a small cost to pay…

    Thursday, October 22, 2020

    Azure DevOps - Allow users to edit process templates

    I got the following question from a colleague:

    “Is it possible to give all team members the necessary rights to edit a specific process template?”

    Let’s see how we can do this:

    • Scroll to the Process section on the left and click on it.

    • Click on the … next to the process template you want to edit and click on the Security item in the context menu.

    • Search for the team you want to add. After adding it, change the Edit process permission to “Allow”.

    • That’s it!

    Wednesday, October 21, 2020

    Entity Framework Core –Pessimistic concurrency

    After blogging about pessimistic concurrency and the usage of locks in NHibernate yesterday, I wanted to write a follow-up on how to do the same thing in Entity Framework Core. It turns out that EF Core does not support pessimistic concurrency…

    However, this does not mean it is not possible. We’ll have to throw some raw SQL into the mix to get it working.

    First we need to explicitly create a database transaction:

    After doing that we need to take the lock. We do this by issuing a ‘fake’ update statement that locks the row:

    Now we can execute our ‘real’ update using EF Core:

    Here is the full code example:
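Putting it together, a sketch against SQL Server (context, entity, and column names are assumptions; the ‘fake’ update takes the lock for the duration of the transaction):

```csharp
using var transaction = await context.Database.BeginTransactionAsync();

// 'fake' update: locks the row without changing any data
await context.Database.ExecuteSqlRawAsync(
    "UPDATE Orders SET Id = Id WHERE Id = {0}", orderId);

// the 'real' update through EF Core
var order = await context.Orders.SingleAsync(o => o.Id == orderId);
order.Status = OrderStatus.Shipped;

await context.SaveChangesAsync();
await transaction.CommitAsync();
```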

    DDD–Strongly typed Ids

    One of the core principles of DDD is the usage of value objects to avoid “primitive obsession”. “Primitives” refer to the built-in types in C#: int, string, Guid, etc. “Primitive obsession” refers to over-using these types to represent domain concepts that aren’t a perfect fit. Some examples are a HouseNumber represented by an int or an EmailAddress represented by a string.

    This concept not only makes sense for your value objects but is also valuable for your Entity and AggregateRoot ids. A ProductId should not be interchangeable with an OrderId.

    Creating a value object for every id type is not that hard, but it remains cumbersome. Let’s introduce StronglyTypedId as a solution.

    From the website:

    StronglyTypedId makes creating strongly-typed IDs as easy as adding an attribute! No more accidentally passing arguments in the wrong order to methods - StronglyTypedId uses Roslyn-powered build-time code generation to generate the boilerplate required to use strongly-typed IDs.

    Getting started

    • To get started, first add the StronglyTypedId NuGet package to your project: https://www.nuget.org/packages/StronglyTypedId/
    • Now create a struct and add the StronglyTypedId attribute to it.
      • Notice that we specify to not generate a JsonConverter. If you want to serialize the type you need to add an extra reference to Newtonsoft.Json or System.Text.Json.
    • Build your project. Let’s have a look at what is generated:
    • By default a Guid is used as the backing field. If you want to use a different type, you can specify this:
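The struct from the steps above, as a sketch (the attribute parameters follow the StronglyTypedId version current at the time; newer versions use different parameter names):

```csharp
// attribute comes from the StronglyTypedId package
[StronglyTypedId(generateJsonConverter: false)]
public partial struct ProductId { }

// with a different backing type:
[StronglyTypedId(backingType: StronglyTypedIdBackingType.Int, generateJsonConverter: false)]
public partial struct OrderId { }
```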

    Tuesday, October 20, 2020

    NHibernate–Pessimistic concurrency

    In most cases locking is something that you want to avoid as it limits the level of concurrency of your database. But sometimes that is exactly what you want.

    You can use pessimistic concurrency in NHibernate by using an overload that takes a LockMode:

    When using session.Get<T>:

    Or when using session.Query<T>:
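Both variants as a sketch (entity names assumed; the LINQ overload needs NHibernate 5.2+):

```csharp
using NHibernate;
using NHibernate.Linq;

// Get with an upgrade lock (translated to SELECT ... FOR UPDATE / WITH (UPDLOCK))
var order = session.Get<Order>(orderId, LockMode.Upgrade);

// LINQ query with a lock
var pendingOrders = session.Query<Order>()
    .Where(o => o.Status == OrderStatus.Pending)
    .WithLock(LockMode.Upgrade)
    .ToList();
```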

    Microsoft MakeCode

    If you want to teach your children the joy of programming, have a look at Microsoft MakeCode.

    From the website:

    Microsoft MakeCode is a free, open source platform for creating engaging computer science learning experiences that support a progression path into real-world programming.

    It contains a lot of simulators, tutorials and examples for multiple devices and platforms.

    Monday, October 19, 2020

    Azure DevOps– Lead time and cycle time

    Both lead time and cycle time can be measured and visualized in Azure DevOps.

    Lead time is calculated from work item creation to entering a completed state. Cycle time is calculated from first entering an In Progress state to entering a Completed state as visualized below:

    The easiest way to get this data is by using the Cycle Time and Lead Time widgets.

    • To enable these widgets, go to the Dashboards page in your Azure DevOps project.

    • Click on Edit on the right.
    • In the Add Widget search box, enter ‘cycle’ to search for the Cycle Time widget. Click on Add to add it to the dashboard

    • On the Configuration page, you can select the team, backlog level and time period. You also have the option to further filter the backlog using field criteria.

    • Click on Save and Close to apply the configuration.

    • Repeat the steps above for the Lead Time widget.

    More information: https://docs.microsoft.com/en-us/azure/devops/report/dashboards/cycle-time-and-lead-time?view=azure-devops

    HotChocolate GraphQL - Integration test authorization

    The HotChocolate blog gives some guidance on how to write integration tests. This is a good starting point, but it doesn’t get you to a final solution when you are using authorization in your GraphQL schema.

    In that case you need a way to inject an authenticated ClaimsPrincipal into the GraphQL middleware. It took us some time to figure out a solution but here are the steps involved:

    Create custom middleware:

    Register middleware:
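A sketch of what worked for us (claim values and the middleware name are placeholders):

```csharp
using System.Security.Claims;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;

public class TestAuthenticationMiddleware
{
    private readonly RequestDelegate _next;

    public TestAuthenticationMiddleware(RequestDelegate next) => _next = next;

    public async Task InvokeAsync(HttpContext context)
    {
        // inject an authenticated principal before the GraphQL middleware runs
        var identity = new ClaimsIdentity(
            new[] { new Claim(ClaimTypes.Name, "integration-test-user") },
            authenticationType: "Test");

        context.User = new ClaimsPrincipal(identity);

        await _next(context);
    }
}

// registration, before the GraphQL middleware:
// app.UseMiddleware<TestAuthenticationMiddleware>();
// app.UseGraphQL();
```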

    Friday, October 9, 2020

    GraphQL Altair–Test and debug your GraphQL queries

    As the GraphQL ecosystem keeps growing, I keep finding new tools that make my job as a GraphQL developer easier. Recently I started using Altair, a GraphQL client application that makes it easy to test your GraphQL endpoints.

    A selection of the features it has to offer:

    • Multi language/platform/window
    • Import and export queries
    • File upload
    • Syntax and query highlighting
    • Autofill all fields

    Thursday, October 8, 2020

    Quick tip if you want to play around with C# 9

    The easiest way to start playing with C# 9 is by installing LinqPad 6. This allows you to play with the new language features without installing any extra dependencies.

    To enable the new language features, go to Edit –> Preferences.

    Go to the Query tab and check the box next to Enable C# 9 preview features:

    Wednesday, October 7, 2020

    C#9- Record types

    One of the most anticipated features coming in C# 9 is record types. Record types make it easy to create immutable reference types in .NET. They are a perfect fit for Value Objects in DDD terms.

    A new keyword is introduced to support this: record. The record keyword makes an object immutable and makes it behave like a value type.

    An example:
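A simple record with init-only properties (the shape shown in the C# 9 announcement):

```csharp
public record Person
{
    public string FirstName { get; init; }
    public string LastName { get; init; }
}
```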

    You can even write this more concisely using positional records:
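The same type as a positional record:

```csharp
public record Person(string FirstName, string LastName);
```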

    But what if you want to change the person’s name? How can we do this knowing that the object is immutable? Do I need to copy all properties into a new object?

    Let’s introduce the with keyword to fix this. It allows you to create an object from another by specifying which properties change:
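For example:

```csharp
var person = new Person("John", "Doe");

// non-destructive mutation: copies 'person' with one property changed
var otherPerson = person with { LastName = "Smith" };
```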

    Tuesday, October 6, 2020

    C#9–Covariant return types

    C# 9 adds a whole list of new language features. Most people talk about the support for record types and improved pattern matching, but one of the features that I’m happy to see finally added to the language is covariant return types.

    You are probably asking covariant return what?????

    Let’s explain: with return type covariance, you can override a base class method that returns a less specific type with one that returns a more specific type.

    An example: before C# 9 you would have to return the base type from the inherited class:
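    The original snippet is missing; the pre-C# 9 version might have looked like this (type names are assumptions):

```csharp
public abstract class Animal
{
    public abstract Animal Clone();
}

public class Dog : Animal
{
    // Before C# 9 the override had to keep the base return type,
    // forcing callers to cast the result back to Dog.
    public override Animal Clone() => new Dog();
}
```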

    In C# 9, you can do the following:
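    A sketch of the C# 9 version (same assumed type names):

```csharp
public abstract class Animal
{
    public abstract Animal Clone();
}

public class Dog : Animal
{
    // C# 9: the override may return the more specific type directly.
    public override Dog Clone() => new Dog();
}
```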

    Monday, October 5, 2020

    RabbitMQ–Lazy queues

    By default RabbitMQ tries to keep your whole queue in memory. This is OK as long as your messages are processed fast enough, but not if your queue becomes very long (many millions of messages). Queues can become very long for various reasons:

    • consumers are offline / have crashed / are down for maintenance
    • there is a sudden message ingress spike, producers are outpacing consumers
    • consumers are slower than normal

    Lazy queues can help in these situations: messages are moved to disk as early as practically possible, and are only loaded into RAM when requested by consumers. This comes at the cost of increased disk I/O.

    You can configure this in MassTransit when configuring your receive endpoint:
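    The configuration code is not shown above; a sketch (the queue name and consumer are placeholders) could look like this:

```csharp
// Configure a RabbitMQ receive endpoint as a lazy queue in MassTransit.
var bus = Bus.Factory.CreateUsingRabbitMq(cfg =>
{
    cfg.ReceiveEndpoint("order-events", e =>
    {
        e.Lazy = true; // declares the queue with x-queue-mode=lazy
        e.Consumer<OrderEventConsumer>();
    });
});
```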

    More information: https://www.rabbitmq.com/lazy-queues.html

    Friday, October 2, 2020

    Database profiling in Visual Studio

    Visual Studio 2019 version 16.3 extends the Performance Profiler with a new ‘Database’ option. This enables the new Database tool that captures all database activity during your profiling session.

    Remark: The database tool works with .NET Core projects using either ADO.NET or Entity Framework Core. It also works on traces collected using dotnet trace, which means we can collect data anywhere that .NET Core runs (including Linux!) and analyze that data in Visual Studio.

    Let’s try it:

    • Open the Performance Profiler in Visual Studio by clicking Debug > Performance Profiler.
    • Select the checkbox next to “Database” to enable the tool.

    • Click on Start to start the profiling session. Now interact with your application in the ways you’re interested in investigating. When you are done click ‘Stop collection’.

    • Now you get a table of all the queries that happened during your profiling session along with a graph that shows when and how many queries happen over time.

    • After identifying a query that warrants further investigation, you can go to the related code by right-clicking on a row, and selecting “Go To Source File”!

    Thursday, October 1, 2020

    Improve the startup time of your .NET Core application

    Starting from .NET Core 3.x, tiered compilation is enabled by default. This allows you to use precompiled code from assemblies created in the ReadyToRun (R2R) format. R2R is a form of ahead-of-time (AOT) compilation.

    It improves startup performance by reducing the amount of work the just-in-time (JIT) compiler needs to do as your application loads.

    To use this feature you need to enable ReadyToRun and publish your application as a self-contained app.

    1. Add the <PublishReadyToRun> setting to your project:
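    The setting itself (not shown above) goes into a PropertyGroup in your project file:

```xml
<PropertyGroup>
  <PublishReadyToRun>true</PublishReadyToRun>
</PropertyGroup>
```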


    2. Publish a self-contained app. Here is an example targeting the Linux ARM64 runtime:

      dotnet publish -c Release -r linux-arm64 --self-contained

    Wednesday, September 30, 2020

    .NET Core Plugin architecture

    A plugin architecture remains a valuable option to make your application extensible. With the introduction of AssemblyLoadContext and AssemblyDependencyResolver in .NET Core 3, creating and loading plugins became a lot easier.

    Still if you need more features I would recommend having a look at the DotNetCorePlugins project: https://github.com/natemcmaster/DotNetCorePlugins

    Usage is simple through the PluginLoader class:
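    The usage snippet is missing above; based on the DotNetCorePlugins README, it looks roughly like this (the IPlugin interface and the plugin path are assumptions):

```csharp
using System;
using System.Linq;
using McMaster.NETCore.Plugins;

// Load a plugin assembly, sharing the IPlugin contract type with the host.
var loader = PluginLoader.CreateFromAssemblyFile(
    assemblyFile: @"plugins/MyPlugin/MyPlugin.dll",
    sharedTypes: new[] { typeof(IPlugin) });

var assembly = loader.LoadDefaultAssembly();

// Instantiate and run every concrete IPlugin implementation found.
foreach (var pluginType in assembly.GetTypes()
    .Where(t => typeof(IPlugin).IsAssignableFrom(t) && !t.IsAbstract))
{
    var plugin = (IPlugin)Activator.CreateInstance(pluginType);
    plugin.Execute();
}
```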

    One nice feature it adds to the mix is hot reloading. This will allow you to dynamically update assemblies on the fly:

    Tuesday, September 29, 2020

    Azure DevOps–Dependency Tracker

    When working with multiple teams in Azure DevOps that have cross team dependencies, it can become quite hard to track what is going on.

    If you recognize this problem I would recommend having a look at the Dependency Tracker extension for Azure DevOps: https://marketplace.visualstudio.com/items?itemName=ms-eswm.dependencytracker&ssr=false#overview

    The Dependency Tracker extension provides you with the ability to plan and manage dependencies across teams, projects, and even organizations. It provides filterable views to show all dependencies a team is consuming and producing.

    You can see both producing and consuming dependencies:

    And there is even a timeline view available:

    The easiest way to manage your dependencies is to use the predecessor/successor link type to link work items together:

    Create links manually

    More information: https://docs.microsoft.com/en-us/azure/devops/boards/extensions/dependency-tracker

    Monday, September 28, 2020

    Angular 10–Strict mode

    With the release of Angular 10 a more strict project setup is available. You can enable it by adding the --strict flag when creating a new Angular project:

    ng new --strict

    This strict mode enables more build-time optimizations, helps to catch bugs faster and improves the overall maintainability of your Angular application.

    Therefore it applies the following changes:

    • Enables strict mode in TypeScript
    • Turns template type checking to Strict
    • Default bundle budgets have been reduced by ~75%
    • Configures linting rules to prevent declarations of type any
    • Configures your app as side-effect free to enable more advanced tree-shaking

    There is no out-of-the-box option to enable this for an existing project (after creating it without the --strict flag or after upgrading from a previous version), but you can apply the same changes manually:

    • Add the following rules to your tsconfig.json:

      "compilerOptions": {
        "strict": true,
        "forceConsistentCasingInFileNames": true,
        "noFallthroughCasesInSwitch": true
      },
      "angularCompilerOptions": {
        "strictInjectionParameters": true,
        "strictInputAccessModifiers": true,
        "strictTemplates": true
      }

    • Update the tslint.json:

      "no-any": true

    • Update the bundle budget sizes in your angular.json:

      "configurations": {
        "production": {
          "budgets": [
            {
              "type": "initial",
              "maximumWarning": "500kb",
              "maximumError": "1mb"
            },
            {
              "type": "anyComponentStyle",
              "maximumWarning": "2kb",
              "maximumError": "4kb"
            }
          ]
        }
      }

    • Add a schematics entry to the projects.[projectName].schematics path in the angular.json:

      "schematics": {
        "@schematics/angular:application": {
          "strict": true
        }
      }

    More information: https://blog.angular.io/angular-cli-strict-mode-c94ba5965f63

    Friday, September 25, 2020

    Computer stuff they didn’t teach you…

    It is always fun to see Scott Hanselman in action. In this video series he explains a lot of IT concepts in a simple and concise way.

    A good way to end your work week…

    Thursday, September 24, 2020

    GraphQL–The query has non-mergable fields

    I constructed the following query in GraphQL playground:

    In this query I try to combine two calls: one to fetch a product by id and another one to fetch a product by name.

    However when I tried to execute this query it resulted in the following error message:

    The query has non-mergable fields

    The problem is that GraphQL tries to merge the results into one product result object, which does not work.

    To get the behavior I want, I need to specify an alias for every query:
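    The aliased query is not shown above; it might look roughly like this (field and argument names are assumptions):

```graphql
query {
  byId: product(id: 1) {
    name
  }
  byName: productByName(name: "Keyboard") {
    id
  }
}
```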

    Now when I execute the query, the query results are put in their own object:

    Wednesday, September 23, 2020

    Error when running Sonar Cloud on Angular projects

    And the story continues…

    After setting up Sonar Cloud integration in Azure DevOps and applying a fix for .NET Core applications, I tried to do the same thing for our Angular code.

    It didn’t work as expected, this is the output I got:

    [More Information](https://sonarcloud.io/documentation/analysis/scan/sonarscanner-for-azure-devops/)


    D:\a\_tasks\SonarCloudPrepare_14d9cde6-c1da-4d55-aa01-2965cd301255\1.12.0\classic-sonar-scanner-msbuild\SonarScanner.MSBuild.exe end

    SonarScanner for MSBuild 4.10

    Using the .NET Framework version of the Scanner for MSBuild

    Post-processing started.

    18:04:47.123 Fetching code coverage report information from TFS...

    18:04:47.125 Attempting to locate a test results (.trx) file...

    18:04:47.64 Looking for TRX files in: D:\a\1\TestResults

    18:04:47.64 No test results files found

    18:04:48.125 Did not find any binary coverage files in the expected location.

    18:04:48.127 Falling back on locating coverage files in the agent temp directory.

    18:04:48.128 Searching for coverage files in D:\a\_temp

    18:04:48.128 No coverage files found in the agent temp directory.

    ##[error]The SonarQube MSBuild integration failed: SonarQube was unable to collect the required information about your projects.

    Possible causes:

    1. The project has not been built - the project must be built in between the begin and end steps

    2. An unsupported version of MSBuild has been used to build the project. Currently MSBuild 14.0.25420.1 and higher are supported.

    3. The begin, build and end steps have not all been launched from the same folder

    4. None of the analyzed projects have a valid ProjectGuid and you have not used a solution (.sln)

    The SonarQube MSBuild integration failed: SonarQube was unable to collect the required information about your projects.

    Possible causes:

    1. The project has not been built - the project must be built in between the begin and end steps

    2. An unsupported version of MSBuild has been used to build the project. Currently MSBuild 14.0.25420.1 and higher are supported.

    3. The begin, build and end steps have not all been launched from the same folder

    4. None of the analyzed projects have a valid ProjectGuid and you have not used a solution (.sln)

    Generation of the sonar-properties file failed. Unable to complete SonarQube analysis.

    ##[error]18:04:48.176 Post-processing failed. Exit code: 1

    18:04:48.176 Post-processing failed. Exit code: 1

    ##[error]The process 'D:\a\_tasks\SonarCloudPrepare_14d9cde6-c1da-4d55-aa01-2965cd301255\1.12.0\classic-sonar-scanner-msbuild\SonarScanner.MSBuild.exe' failed with exit code 1

    Finishing: Run Code Analysis

    Did you notice my mistake? As I’m building an Angular application, I shouldn’t be using MSBuild. Instead I need to use the standalone scanner.

    Let’s fix this in the Prepare Analysis task:

    Tuesday, September 22, 2020

    Error when running Sonar Cloud on .NET Core projects

    Yesterday I talked about setting up Sonar Cloud for code analysis.

    Unfortunately it turned out to be not so simple as I expected. The first time I ran the build pipeline the Code analysis task failed with the following error:

    2020-09-14T17:47:52.6441174Z ##[section]Starting: Run Code Analysis

    2020-09-14T17:47:52.6563714Z ==============================================================================

    2020-09-14T17:47:52.6564026Z Task         : Run Code Analysis

    2020-09-14T17:47:52.6564329Z Description  : Run scanner and upload the results to the SonarCloud server.

    2020-09-14T17:47:52.6564583Z Version      : 1.15.0

    2020-09-14T17:47:52.6564794Z Author       : sonarsource

    2020-09-14T17:47:52.6565319Z Help         : Version: 1.15.0. This task is not needed for Maven and Gradle projects since the scanner should be run as part of the build.

    [More Information](https://sonarcloud.io/documentation/analysis/scan/sonarscanner-for-azure-devops/)

    2020-09-14T17:47:52.6565909Z ==============================================================================

    2020-09-14T17:47:52.9698863Z [command]D:\a\_tasks\SonarCloudPrepare_14d9cde6-c1da-4d55-aa01-2965cd301255\1.12.0\classic-sonar-scanner-msbuild\SonarScanner.MSBuild.exe end

    2020-09-14T17:47:53.0469811Z SonarScanner for MSBuild 4.10

    2020-09-14T17:47:53.0470290Z Using the .NET Framework version of the Scanner for MSBuild

    2020-09-14T17:47:53.0984270Z Post-processing started.

    2020-09-14T17:47:54.3358218Z WARNING: Duplicate ProjectGuid: "00000000-0000-0000-0000-000000000000". The project will not be analyzed by SonarQube. Project file: "d:\a\1\s\bridges.cms.modules.contentcreation.shared\bridges.cms.modules.contentcreation.shared.csproj"

    2020-09-14T17:47:54.3359968Z WARNING: Duplicate ProjectGuid: "00000000-0000-0000-0000-000000000000". The project will not be analyzed by SonarQube. Project file: "d:\a\1\s\bridges.cms.modules.contentcreation.infrastructure\bridges.cms.modules.contentcreation.infrastructure.csproj"

    2020-09-14T17:47:54.3361069Z WARNING: Duplicate ProjectGuid: "00000000-0000-0000-0000-000000000000". The project will not be analyzed by SonarQube. Project file: "d:\a\1\s\bridges.cms.modules.contentcreation.domain\bridges.cms.modules.contentcreation.domain.csproj"

    2020-09-14T17:47:54.3362264Z WARNING: Duplicate ProjectGuid: "00000000-0000-0000-0000-000000000000". The project will not be analyzed by SonarQube. Project file: "d:\a\1\s\bridges.cms.modules.contentcreation.application\bridges.cms.modules.contentcreation.application.csproj"

    2020-09-14T17:47:54.3364005Z WARNING: Duplicate ProjectGuid: "00000000-0000-0000-0000-000000000000". The project will not be analyzed by SonarQube. Project file: "d:\a\1\s\bridges.cms.modules.contentcreation.dataaccess\bridges.cms.modules.contentcreation.dataaccess.csproj"

    2020-09-14T17:47:54.3365002Z WARNING: Duplicate ProjectGuid: "00000000-0000-0000-0000-000000000000". The project will not be analyzed by SonarQube. Project file: "d:\a\1\s\bridges.cms.modules.contentcreation.archtests\bridges.cms.modules.contentcreation.archtests.csproj"

    2020-09-14T17:47:54.3366234Z WARNING: Duplicate ProjectGuid: "00000000-0000-0000-0000-000000000000". The project will not be analyzed by SonarQube. Project file: "d:\a\1\s\bridges.cms.modules.contentcreation.integrationtests\bridges.cms.modules.contentcreation.integrationtests.csproj"

    2020-09-14T17:47:54.3367348Z WARNING: Duplicate ProjectGuid: "00000000-0000-0000-0000-000000000000". The project will not be analyzed by SonarQube. Project file: "d:\a\1\s\bridges.cms.api\bridges.cms.api.csproj"

    2020-09-14T17:47:54.3368778Z WARNING: Duplicate ProjectGuid: "00000000-0000-0000-0000-000000000000". The project will not be analyzed by SonarQube. Project file: "d:\a\1\s\bridges.cms.modules.contentcreation.unittests\bridges.cms.modules.contentcreation.unittests.csproj"

    2020-09-14T17:47:54.3370405Z ##[error]No analysable projects were found. SonarQube analysis will not be performed. Check the build summary report for details.

    2020-09-14T17:47:54.3371973Z No analysable projects were found. SonarQube analysis will not be performed. Check the build summary report for details.

    2020-09-14T17:47:54.3431567Z Generation of the sonar-properties file failed. Unable to complete SonarQube analysis.

    2020-09-14T17:47:54.3479798Z ##[error]17:47:54.346  Post-processing failed. Exit code: 1

    2020-09-14T17:47:54.3481669Z 17:47:54.346  Post-processing failed. Exit code: 1

    2020-09-14T17:47:54.3668330Z ##[error]The process 'D:\a\_tasks\SonarCloudPrepare_14d9cde6-c1da-4d55-aa01-2965cd301255\1.12.0\classic-sonar-scanner-msbuild\SonarScanner.MSBuild.exe' failed with exit code 1

    2020-09-14T17:47:54.3739014Z ##[section]Finishing: Run Code Analysis

    The problem is that SonarCloud expects a ProjectGuid to be able to differentiate between the different C# projects (.csproj files) in my source repository. As these are .NET Core projects, such a ProjectGuid no longer exists.

    To solve this problem I changed my dotnet build task to use the solution file instead of using csproj files:
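    In a YAML pipeline the change might look like this (the task was configured through the classic editor in my case, so treat this as an illustrative sketch):

```yaml
# Point the build at the solution file instead of individual .csproj files.
- task: DotNetCoreCLI@2
  displayName: 'dotnet build'
  inputs:
    command: 'build'
    projects: '**/*.sln'
```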

    Monday, September 21, 2020

    Running code analysis through Sonar Cloud in Azure DevOps

    Sonar Cloud is the SaaS version of SonarQube, a static code analyzer. It can inspect your code against a set of quality standards, detect bugs and security vulnerabilities, calculate technical debt and show how your code quality evolves over time.

    If you want to use it in Azure DevOps you should first install the SonarCloud extension from the marketplace: https://marketplace.visualstudio.com/items?itemName=SonarSource.sonarcloud

    After the extension is installed you get 3 new build tasks:

    Remark: Notice that these are not the same build tasks as should be used when using SonarQube(!)

    Let’s create a build pipeline that uses these tasks:

    • First add the Prepare analysis on SonarCloud task. This task should be added to the beginning of your pipeline.  In this task you should configure the SonarCloud Service Endpoint, specify an Organization and set a Project Key and Project Name.
      • This information will be used to create a new project inside SonarCloud.
      • A last important step inside this task is to select the way analysis should be done. As we are building a .NET Core application, we can set this to ‘Integrate with MSBuild’:

    • Next we should add the Run Code Analysis task. This task should be executed after our code has been built and all tests are executed.
    • At the end of the pipeline we can add the Publish Quality Gate Results step to upload the data to SonarCloud.

    Our full pipeline now looks like this:

    A last tip: I would recommend not configuring this on every CI build, as it makes your build time a lot longer.

    Friday, September 18, 2020

    Building a producer/consumer pipeline in .NET Core using Open.ChannelExtensions

    One of the lesser-known features of .NET Core is System.Threading.Channels. It allows you to implement a pipeline of producers/consumers without having to worry about locking, concurrency and so on.

    For an introduction have a look here: https://devblogs.microsoft.com/dotnet/an-introduction-to-system-threading-channels/

    Although it would be a good solution for a lot of use cases, I don’t see it used that often. I think the main reason is that the API is not that intuitive and it takes some time to figure out how to use it.

    Let’s have a look at an example (I borrowed it from Sacha Barber’s great introduction to System.Threading.Channels):

    Although this example is rather trivial, it takes some time to wrap your head around it and understand what is going on. Let’s see if we can simplify it using Open.ChannelExtensions, a library that offers a set of extensions for optimizing and simplifying System.Threading.Channels usage.

    Here is the simplified code:
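    The simplified code is not shown above; based on the Open.ChannelExtensions README, it might look roughly like this (the stage logic is a placeholder, not the original pipeline):

```csharp
using System;
using System.Linq;
using System.Threading.Channels;
using System.Threading.Tasks;
using Open.ChannelExtensions;

// A bounded channel fed from an enumerable, a transform stage and a
// consumer, wired together with the library's extension methods.
await Channel
    .CreateBounded<int>(capacity: 100)
    .Source(Enumerable.Range(1, 1000))      // producer
    .PipeAsync(async i =>                   // transform stage
    {
        await Task.Delay(1);
        return i * 2;
    })
    .ReadAll(i => Console.WriteLine(i));    // consumer
```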

    Thursday, September 17, 2020

    MassTransit–Reading a header using middleware

    After blogging about adding a header yesterday, today let’s have a look at how we can read this header in a generic way (if you only want to do this for a specific consumer, take a look at this blog post from last week).

    We’ll start by creating the filter:
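    The filter code is not included above; a sketch of such a consume filter (the header name and what you do with the value are assumptions) could look like this:

```csharp
using System.Threading.Tasks;
using GreenPipes;
using MassTransit;

// Hypothetical consume filter that reads a "user-id" header
// before the message reaches the consumer.
public class UserHeaderFilter<T> : IFilter<ConsumeContext<T>> where T : class
{
    public async Task Send(ConsumeContext<T> context, IPipe<ConsumeContext<T>> next)
    {
        if (context.Headers.TryGetHeader("user-id", out var userId))
        {
            // Make the value available to the application,
            // e.g. via a scoped service.
        }

        await next.Send(context);
    }

    public void Probe(ProbeContext context) => context.CreateFilterScope("userHeader");
}
```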

    Now we need to find a way to register this filter in a generic way so that it is applied for every consumer. The trick is to create an IConsumerConfigurationObserver implementation. This observer will be applied for every consumer that is configured. A perfect fit for our use case:

    Of course we are not there yet. We still need a way to tell MassTransit to apply this observer. This can be done by calling the ConnectConsumerConfigurationObserver method on the IBusFactoryConfigurator:
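    Connecting the observer is a one-liner during bus configuration; a sketch (the observer class name is an assumption):

```csharp
// Connect the observer so it is invoked for every configured consumer.
var bus = Bus.Factory.CreateUsingRabbitMq(cfg =>
{
    cfg.ConnectConsumerConfigurationObserver(new UserHeaderConfigurationObserver());

    // ... receive endpoint configuration ...
});
```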

    Wednesday, September 16, 2020

    MassTransit–Adding a header using middleware

    MassTransit offers a lot of flexibility and it is easy to tweak it to your needs.

    With all this power it is not always easy to figure out the best way to solve a specific problem. I wanted to include a header with information about the user in every message.

    I solved it by adding two filters: one for sending messages and one for publishing messages:
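    The filters themselves are not shown above; a sketch of the send variant (the header name and the IUserFactory contract are assumptions, and the publish filter would look the same with PublishContext) could look like this:

```csharp
using System.Threading.Tasks;
using GreenPipes;
using MassTransit;

// Hypothetical send filter that stamps a "user-id" header
// on every outgoing message.
public class UserHeaderSendFilter<T> : IFilter<SendContext<T>> where T : class
{
    private readonly IUserFactory _userFactory; // assumed application service

    public UserHeaderSendFilter(IUserFactory userFactory) => _userFactory = userFactory;

    public async Task Send(SendContext<T> context, IPipe<SendContext<T>> next)
    {
        context.Headers.Set("user-id", _userFactory.GetCurrentUser().Id);
        await next.Send(context);
    }

    public void Probe(ProbeContext context) => context.CreateFilterScope("userHeaderSend");
}
```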

    Inside these filters I resolved an IUserFactory which gave me access to the underlying user. This is how I implemented this in ASP.NET Core:

    I registered both filters in the IoC container (Autofac in our case) to make sure that the dependencies are correctly resolved:

    And as a last step I configured MassTransit to use these filters:

    Tuesday, September 15, 2020

    C# 8–IAsyncDisposable

    The IDisposable interface has always been an important cornerstone for correctly cleaning up resources when you no longer need them. Together with the using statement, it became an easy pattern to implement and use in C#.

    Unfortunately you couldn’t use IDisposable when you had some async work to do to clean up your resources. This is solved with the introduction of the IAsyncDisposable interface in C# 8.

    Be aware that when you implement the IAsyncDisposable interface and your class is not sealed, you should not only implement the interface but also provide a second method, DisposeAsyncCore, with the following signature:

    This is to help guarantee that all resources are cleaned up correctly when your class is used as a base class.
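    Following the documented pattern, this looks like the snippet below (the class name and the wrapped stream are placeholders):

```csharp
using System;
using System.IO;
using System.Threading.Tasks;

public class ExampleResource : IAsyncDisposable
{
    private readonly MemoryStream _stream = new MemoryStream();

    public async ValueTask DisposeAsync()
    {
        // Delegate the real cleanup to the virtual method so derived
        // classes can extend it, then suppress finalization.
        await DisposeAsyncCore();
        GC.SuppressFinalize(this);
    }

    protected virtual async ValueTask DisposeAsyncCore()
    {
        await _stream.DisposeAsync();
    }
}
```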

    Your code can then be used in an ‘await using’:
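    For example, with a FileStream (the file name is a placeholder):

```csharp
await using (var stream = new FileStream("data.txt", FileMode.Open))
{
    // work with the stream
} // DisposeAsync is awaited when the block exits
```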

    More information: Implementing a dispose async method