Monday, November 30, 2020

Visual Studio - No test matches the given testcase filter

When trying to review some code from a colleague, I first tried to run the available unit tests in Visual Studio. The tests were successfully discovered but when I tried to execute them, nothing happened. In the test output window I noticed the following message:

No test matches the given testcase filter

This error made me think that somewhere a testcase filter was set. But where?

I started searching through the available settings and config files but I couldn't find anything special. The error message misled me; the problem was not related to any kind of testcase filter at all. It turned out that the problem was caused by the fact that the NUnit Test Adapter wasn't installed.

I opened the NuGet Package Manager and installed the NUnit3TestAdapter package.
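The same can be done from the command line; a sketch using the dotnet CLI (the project name is an example):

```
dotnet add MyTests.csproj package NUnit3TestAdapter
```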

Friday, November 27, 2020

C# 8–Default!

After enabling ‘Nullable reference types’ in a C# project, I got a lot of the following warnings:

CS8618 - Non-nullable property '<propertyname>' must contain a non-null value when exiting constructor. Consider declaring the property as nullable.

I got this message a lot on my ViewModels, which looked like this:
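The original code isn't shown here, but a minimal ViewModel that triggers the warning looks something like this (the class and property names are hypothetical):

```csharp
public class ProductViewModel
{
    // CS8618: non-nullable property must contain a non-null value when exiting constructor
    public string Name { get; set; }
    public string Description { get; set; }
}
```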

I could follow the compiler's suggestion and mark the property as nullable. But I know that the ViewModel is populated from the database and that the provided value can never be null.

In that case the trick to get rid of the warning is to use a property assignment with a default literal:
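A sketch of what that looks like, using the hypothetical ViewModel from above; the `default` literal combined with the null-forgiving operator (`default!`) tells the compiler not to warn:

```csharp
public class ProductViewModel
{
    // 'default!' suppresses CS8618: we promise the property is populated before use
    public string Name { get; set; } = default!;
    public string Description { get; set; } = default!;
}
```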

Thursday, November 26, 2020

SQL Server Management Studio - Template explorer

I've been using SQL Server Management Studio for a long time, but it wasn't until recently that I discovered it contains a feature called Template Explorer.

What is it?

Through Template Explorer you get access to a large list of scripts useful for a lot of scenarios. Next time you want to search for how to execute a specific task, take a look at the Template Explorer first. There is a good chance that a script is already available.

Where can I find it?

To locate the templates in SQL Server Management Studio, go to View, click Template Explorer, and then select SQL Server Templates.

More information:

Wednesday, November 25, 2020

Azure DevOps Server – Upgrade ElasticSearch

After upgrading your Azure DevOps Server instance, you should also upgrade your ElasticSearch instance.

If the instance is running on the same server as Azure DevOps, it can be automatically updated using the Configure Search feature in the Azure DevOps Administration console.

In case you are running ElasticSearch on a different machine (which I would recommend), you need to copy the Search service package to the target server. You can find the link in the wizard that is used to Configure Search:

On the remote server extract the installer files and execute the following command:

.\Configure-TFSSearch.ps1 -Operation update

The installer should be able to find the existing ElasticSearch instance and update it for you.

More information:

Tuesday, November 24, 2020

Snapper - A snapshot does not exist

The first time I ran Snapper it failed with the following error message:

    Snapper.Exceptions.SnapshotDoesNotExistException : A snapshot does not exist.

    Apply the [UpdateSnapshots] attribute on the test method or class and then run the test again to create a snapshot.

  Stack Trace:

    SnapshotAsserter.AssertSnapshot(SnapResult snapResult)

    Snapper.MatchSnapshot(Object snapshot, SnapshotId snapshotId)

    Snapper.MatchSnapshot(Object snapshot, String childSnapshotName)

    Snapper.MatchSnapshot(Object snapshot)

    SnapperExtensions.ShouldMatchSnapshot(Object snapshot)

    GraphQLTests.Queries_Succeed(String queryName) line 57

    --- End of stack trace from previous location where exception was thrown ---

This error makes sense as Snapper expects that a snapshot exists when you run your tests. To fix this, you need to update your tests and add the [UpdateSnapshots] attribute above your test method or class.

When Snapper finds this attribute it will not search for an existing snapshot but create a new one (or replace an existing one). After you have run your tests for the first time you can remove the attribute again.
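As an illustration, a minimal xUnit test with the attribute applied (the test and the object under test are made up for this sketch):

```csharp
using Snapper;
using Snapper.Attributes;
using Xunit;

public class PersonTests
{
    [UpdateSnapshots] // remove this attribute once the initial snapshot has been created
    [Fact]
    public void Person_Matches_Snapshot()
    {
        var person = new { Name = "Alice", Age = 30 };
        person.ShouldMatchSnapshot();
    }
}
```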

Don't forget to check in the generated files in the '_snapshots' folder to your source control system.

Monday, November 23, 2020

Using XUnit Theory with Snapper

To test my GraphQL schema I first used SnapShooter.

From the documentation:

Snapshooter is a flexible .Net testing tool, which can be used to validate all your test results with one single assert. It creates simply a snapshot of your test result object and stores it on the file system. If the test is executed again, the test result will be compared with the stored snapshot. If the snapshot matches to the test result, the test will pass.

It is really a good fit to test your GraphQL API. Unfortunately I got into trouble when I tried to combine it with an XUnit Theory. I had created one test that loaded multiple GraphQL files and validated them:
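Reconstructed from memory, the failing test looked something like this (the file names and the ExecuteQueryAsync helper are hypothetical):

```csharp
[Theory]
[InlineData("getProducts.graphql")]
[InlineData("getOrders.graphql")]
public async Task Queries_Succeed(string queryName)
{
    var query = await File.ReadAllTextAsync(Path.Combine("Queries", queryName));
    var result = await ExecuteQueryAsync(query);

    // SnapShooter: every [InlineData] run compares against the same snapshot file
    Snapshot.Match(result);
}
```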

This didn't work as expected because the same snapshot is used for every test run. As a consequence it fails when the same test runs again with a different parameter value.

Probably there is a way to fix it, but I was lazy so I took a look online and found Snapper, which offers similar features as SnapShooter but supports the XUnit Theory functionality through a feature called child snapshots. Let’s update our example:
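With Snapper, the test above can be rewritten to use a child snapshot per parameter value; ShouldMatchChildSnapshot stores each result under its own child name inside one snapshot file (again a sketch, with the same hypothetical helper):

```csharp
[Theory]
[InlineData("getProducts.graphql")]
[InlineData("getOrders.graphql")]
public async Task Queries_Succeed(string queryName)
{
    var query = await File.ReadAllTextAsync(Path.Combine("Queries", queryName));
    var result = await ExecuteQueryAsync(query);

    // Snapper: one child snapshot per queryName, so Theory runs no longer clash
    result.ShouldMatchChildSnapshot(queryName);
}
```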

Friday, November 20, 2020

Enable .NET 5 in your Azure App Service

After deploying a .NET 5 application in an Azure App Service, I got the following error message when I tried to run the application:

HTTP Error 500.31 - ANCM Failed to Find Native Dependencies
Common solutions to this issue:
The specified version of Microsoft.NetCore.App or Microsoft.AspNetCore.App was not found.

This is because in Azure App Service .NET 5 is not enabled by default (yet).

Let’s see how to fix this:

  • Open the Azure portal
  • Go to the App Service you want to configure
  • Click on Configuration in the Settings section

  • Go to the Stack Settings and change the .NET Framework version to .NET 5 (Early Access)

  • Click Save
  • Restart the App Service

Thursday, November 19, 2020

.NET 5–Source Generators–Lessons Learned–Part 3

One of the new .NET 5 features that triggered my curiosity was source generators. I did some investigation on how to use them and want to share the lessons I learned along the way.

Yesterday I finally got my first source generator up and running. Now it is time to move on and do something more interesting. I found an example on GitHub that created a strongly typed config class based on your appsettings.json. I tried to duplicate the code, but when I built my application the source generator didn't run.

In the build output I noticed the following warning:

Unable to load Analyzer assembly c:\lib\netstandard2.0\System.Text.Encodings.Web.dll: Could not find a part of the path 'c:\lib\netstandard2.0\System.Text.Encodings.Web.dll'.

System.IO.FileNotFoundException: Could not find a part of the path 'c:\lib\netstandard2.0\System.Text.Encodings.Web.dll'.
---> System.IO.DirectoryNotFoundException: Could not find a part of the path 'c:\lib\netstandard2.0\System.Text.Encodings.Web.dll'.
   at System.IO.FileStream..ctor(String path, FileMode mode, FileAccess access, FileShare share)
   at Roslyn.Utilities.FileUtilities.OpenFileStream(String path)
   at Microsoft.CodeAnalysis.Diagnostics.AnalyzerFileReference.GetAnalyzerTypeNameMap(String fullPath, AttributePredicate attributePredicate, AttributeLanguagesFunc languagesFunc)

The source generator is using System.Text.Json and System.Text.Encodings.Web to parse the config files. But it seems that the source generator cannot find and use these references.

How to use external references in source generators?

There is some extra work that needs to be done before external libraries can be loaded successfully.

For every reference that your source generator needs you need to do 2 things:

  1. Take a private dependency on the package (PrivateAssets=all). By doing this, consumers of the generator will not reference it.
  2. Set the dependency to generate a path property by adding GeneratePathProperty="true". This will create a new MSBuild property of the format PKG<PackageName>, where <PackageName> is the package name with . replaced by _.

We can then use the generated path properties to add the binaries to the resulting NuGet package, just like we do with the generator itself:
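Based on the source generators cookbook, the project file then looks something like this (the package versions are examples):

```xml
<ItemGroup>
  <!-- Private dependencies with a generated PKG* path property -->
  <PackageReference Include="System.Text.Json" Version="4.7.2" PrivateAssets="all" GeneratePathProperty="true" />
  <PackageReference Include="System.Text.Encodings.Web" Version="4.7.2" PrivateAssets="all" GeneratePathProperty="true" />
</ItemGroup>

<ItemGroup>
  <!-- Ship the dependencies inside the analyzer folder of the NuGet package -->
  <None Include="$(PKGSystem_Text_Json)\lib\netstandard2.0\*.dll" Pack="true" PackagePath="analyzers/dotnet/cs" Visible="false" />
  <None Include="$(PKGSystem_Text_Encodings_Web)\lib\netstandard2.0\*.dll" Pack="true" PackagePath="analyzers/dotnet/cs" Visible="false" />
</ItemGroup>
```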

More information in the source generators cookbook:

Wednesday, November 18, 2020

.NET 5–Source generators–Lessons learned–Part 2

One of the new .NET 5 features that triggered my curiosity was source generators. I did some investigation on how to use them and want to share the lessons I learned along the way.

Yesterday I wrote my first generator. Unfortunately nothing happened and no new code was generated. I tried to attach the debugger, but even after a rebuild still nothing happened. It seemed that the code itself was never called.

It took me some time to figure out why it didn't work; it turns out that you have to put your source generator in a separate assembly (which makes sense). In my first try I had just added the source generator logic to the same project as the other code, but that doesn't work.

How to package a source generator?

To get source generators up and running, you need to create a separate project and add the source generator logic there.

So create a new .NET Standard project and add a reference to the following package:
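A sketch of the package references involved (the versions are examples from the .NET 5 timeframe; Microsoft.CodeAnalysis.CSharp is the package that brings in the ISourceGenerator API):

```xml
<ItemGroup>
  <PackageReference Include="Microsoft.CodeAnalysis.CSharp" Version="3.8.0" PrivateAssets="all" />
  <PackageReference Include="Microsoft.CodeAnalysis.Analyzers" Version="3.3.1" PrivateAssets="all" />
</ItemGroup>
```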

Generators can be packaged using the same method as an analyzer. Therefore the generator should be placed in the analyzers\dotnet\cs folder of the package for it to be automatically added to the user's project on install. To enable this we need to add the following to the project file:
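Following the cookbook, the packaging section looks something like this:

```xml
<ItemGroup>
  <!-- Package the generator in the analyzer folder of the NuGet package -->
  <None Include="$(OutputPath)\$(AssemblyName).dll" Pack="true" PackagePath="analyzers/dotnet/cs" Visible="false" />
</ItemGroup>
```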

Now we can add our source generators here.

There is one last step you need to take into account. If you just create a plain reference to your SourceGenerator project, it will not work. You have to change the project reference and include `OutputItemType="Analyzer" ReferenceOutputAssembly="false"`:
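A sketch of the consuming project's reference (the project name is hypothetical):

```xml
<ItemGroup>
  <ProjectReference Include="..\MySourceGenerator\MySourceGenerator.csproj"
                    OutputItemType="Analyzer"
                    ReferenceOutputAssembly="false" />
</ItemGroup>
```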

Now the source generator is finally invoked when I build my code.

Tuesday, November 17, 2020

.NET 5–Source generators–Lessons learned - Part 1

One of the new .NET 5 features that triggered my curiosity was source generators. I did some investigation on how to use them and want to share the lessons I learned along the way.

But let’s first start with an explanation what “a source generator” actually is:

A Source Generator is a new kind of component that C# developers can write that lets you do two major things:

  1. Retrieve a Compilation object that represents all user code that is being compiled. This object can be inspected and you can write code that works with the syntax and semantic models for the code being compiled, just like with analyzers today.
  2. Generate C# source files that can be added to a Compilation object during the course of compilation. In other words, you can provide additional source code as input to a compilation while the code is being compiled.

So in short, source generators are a new compiler feature that allows you to inspect existing code and generate new code (remark: you cannot change existing code).

I started with the example mentioned in the blog post above:
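For reference, a minimal generator in the spirit of that introduction post (a sketch; the generated class and message are of course free to choose):

```csharp
using System.Text;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.Text;

[Generator]
public class HelloWorldGenerator : ISourceGenerator
{
    public void Initialize(GeneratorInitializationContext context)
    {
        // No initialization required for this simple generator
    }

    public void Execute(GeneratorExecutionContext context)
    {
        // Add a new source file to the compilation
        context.AddSource("HelloWorld.g.cs", SourceText.From(@"
namespace HelloWorldGenerated
{
    public static class HelloWorld
    {
        public static void SayHello() =>
            System.Console.WriteLine(""Hello from generated code!"");
    }
}", Encoding.UTF8));
    }
}
```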

But when I built my project, nothing happened and it looked like no code was generated. What was I doing wrong?

How to debug a source generator?

So the first question became: how can I debug a source generator? I had no clue how to investigate what was (not) happening.

The trick is to add a `Debugger.Launch()` statement inside your code:
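A sketch of what that looks like inside the generator:

```csharp
public void Execute(GeneratorExecutionContext context)
{
#if DEBUG
    // Shows the 'choose a debugger' dialog when the compiler runs the generator
    if (!System.Diagnostics.Debugger.IsAttached)
    {
        System.Diagnostics.Debugger.Launch();
    }
#endif

    // ...generator logic goes here...
}
```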

Now when you build your code, a popup should appear that asks you which debugger you want to launch.

Monday, November 16, 2020

GraphQL Hot Chocolate–Enable global authentication

There are multiple ways to enable authentication in Hot Chocolate. Here is a simple approach:

Step 1 – Enable ASP.NET Core authentication

The first step is to enable authentication at the ASP.NET Core level. Let's use JWT bearer tokens for authentication:
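A sketch of the Startup configuration (the authority and audience values are placeholders):

```csharp
public void ConfigureServices(IServiceCollection services)
{
    services
        .AddAuthentication(JwtBearerDefaults.AuthenticationScheme)
        .AddJwtBearer(options =>
        {
            options.Authority = "https://demo.identityserver.io"; // placeholder
            options.Audience = "graphql-api";                     // placeholder
        });
}

public void Configure(IApplicationBuilder app)
{
    app.UseRouting();
    app.UseAuthentication(); // must run before the GraphQL endpoint is hit
    // ...map the GraphQL endpoint here...
}
```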

Step 2- Enable authentication at the root GraphQL query

The second (and already the last) step is to enable authentication on the root query type. By providing no role or policy names we're simply saying the user must be authenticated.
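Using a type descriptor this can look as follows (a sketch; Query is your root query class, and the authorization support from the HotChocolate.AspNetCore.Authorization package is assumed to be registered):

```csharp
public class QueryType : ObjectType<Query>
{
    protected override void Configure(IObjectTypeDescriptor<Query> descriptor)
    {
        // No roles or policies specified: any authenticated user is allowed
        descriptor.Authorize();
    }
}
```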

Friday, November 13, 2020

Visual Studio 2019 16.8– Git Amend

With the release of Visual Studio 2019 16.8 Git became the default source control experience. It no longer lives inside Team Explorer but got its own menu item.

Read the announcement for more details.

One of the things I noticed when using the new Git experience was the first-class support for 'git amend'. But what does 'git amend' actually do?

Let me explain…

The git commit --amend command is a convenient way to modify the most recent commit. It lets you combine staged changes with the previous commit instead of creating an entirely new commit. Amending does not just alter the most recent commit, it replaces it entirely, meaning the amended commit will be a new entity with its own ref. It is a way to rewrite the git history.
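On the command line the same flow looks like this (a sketch executed in a throwaway scratch repository):

```shell
# Create a scratch repository with one commit
dir=$(mktemp -d) && cd "$dir"
git init -q
git -c user.name=demo -c user.email=demo@example.com commit -q --allow-empty -m "initial"

# Stage a forgotten file and fold it into the previous commit
echo "forgotten change" > notes.txt
git add notes.txt
git -c user.name=demo -c user.email=demo@example.com commit -q --amend --no-edit

# Still a single commit, but it now contains notes.txt
git rev-list --count HEAD
```

Note that the amended commit has a new hash: the original commit is replaced, which is exactly why you should only amend commits that have not been shared yet.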

Warning: Avoid amending a commit that other developers have based their work on. To be safe only amend local commits.

Thursday, November 12, 2020

Predefined type 'System.Runtime.CompilerServices.IsExternalInit' is not defined or imported

With the release of .NET 5 it was finally time to try record types. Here is my first attempt that I copied from the documentation:
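The snippet in question was a simple positional record along these lines (reconstructed; the exact properties don't matter):

```csharp
public record Person(string FirstName, string LastName);
```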

Unfortunately this didn't turn out to be the success I was hoping for. Visual Studio returned the following error message:

Predefined type 'System.Runtime.CompilerServices.IsExternalInit' is not defined or imported
The error is not very descriptive, but the reason I got it was that I forgot to update the project to .NET 5. The fix was easy; after updating the target framework the error disappeared:
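The relevant part of the project file (a sketch):

```xml
<PropertyGroup>
  <TargetFramework>net5.0</TargetFramework>
</PropertyGroup>
```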

CS8632 - The annotation for nullable reference types should only be used in code within a ‘#nullable’ annotations context.

I copy/pasted the following code from an example I found on the web to test source generators (but that is not what this post is about today).
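Reconstructed, the relevant part of that code looked something like this (the type and property names are made up):

```csharp
public class Person
{
    // The '?' marks this string as a nullable reference type
    public string? Name { get; set; }
}
```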

Building this code resulted in the following warning:

Do you have any idea why?

Take a look at the 'Name' property and notice the usage of '?'. This is part of the introduction of nullable reference types and declares the string as nullable.

Nullable reference types are available beginning with C# 8.0, in code that has opted in to a nullable aware context. This is what the warning message is referring to. We need to enable nullable reference type support in our project.

This can be done by adding <Nullable>enable</Nullable> to our csproj file:
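For example (a sketch of the property group):

```xml
<PropertyGroup>
  <TargetFramework>netstandard2.0</TargetFramework>
  <LangVersion>8.0</LangVersion>
  <Nullable>enable</Nullable>
</PropertyGroup>
```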

Wednesday, November 11, 2020

The target framework 'netcoreapp3.0' is out of support and will not receive security updates in the future.

After upgrading to the latest Visual Studio version (16.8.1) that added support for .NET 5, I got a new warning when opening an existing .NET Core 3.0 solution:

The target framework 'netcoreapp3.0' is out of support and will not receive security updates in the future. Please refer to for more information about the support policy.

C:\Program Files\dotnet\sdk\5.0.100\Sdks\Microsoft.NET.Sdk\targets\Microsoft.NET.EolTargetFrameworks.targets

You don't have to panic when you see this warning. The only thing you need to be aware of is that some versions of .NET have a short support lifecycle while other versions are LTS (Long Term Support) versions. .NET Core 3.0 is a release with a short support lifecycle (support ended March 3, 2020) while .NET Core 3.1 is an LTS version.

To get rid of this warning, upgrade your app to .NET Core 3.1.
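In the project file that is a one-line change:

```xml
<TargetFramework>netcoreapp3.1</TargetFramework>
```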

Remark: Be aware that .NET 5 is also a version with a short support lifecycle. Support will end three months after the release of .NET 6 (planned for November 2021), so around February 2022.

Tuesday, November 10, 2020

ORM- Attaching detached object trees

One of the nice features of an ORM is that it allows you to change a complete object tree and the ORM will figure out what needs to be added, updated and deleted. The reason this works is that a typical ORM will by default track all objects that are attached to the ORM abstraction (DbContext in EF Core, ISession in NHibernate). Because it tracks these objects (and their relations) it knows what should happen when you make a change.

Unfortunately, when you send objects to a client through an API (typically using DTOs and tools like AutoMapper) and you get a changed object tree back, the objects in this tree are no longer attached to the ORM, so it doesn't know what should be inserted, updated or deleted. Luckily most ORMs are smart and use different tricks to figure out what has changed.

However, one place where these ORMs typically fail is when you are using AutoMapper and mapping a child collection in your DTOs to your domain model. Here's an example using FluentNHibernate:
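A sketch of what such a mapping could look like (the entity and property names are illustrative):

```csharp
public class OrderMap : ClassMap<Order>
{
    public OrderMap()
    {
        Id(x => x.Id);
        // The child collection: NHibernate tracks adds/updates/deletes through it
        HasMany(x => x.OrderDetails)
            .Inverse()
            .Cascade.AllDeleteOrphan();
    }
}
```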

Imagine that we have an existing Order entity with an OrderDetail collection containing the following data:

{ Id: 22, OrderId: 49 }

And an OrderDetail DTO arrives with the following data:

{ Id: 22, OrderId: 49 }

{ Id: 23, OrderId: 49 }

This should result in an INSERT operation, as one new OrderDetail is added. Instead it fails with the following error message:

NHibernate.NonUniqueObjectException: a different object with the same identifier value was already associated with the session: 22, of entity: OrderDetail

The reason this fails is that AutoMapper by default creates a new collection when mapping from one type to another. This confuses NHibernate and makes it think that both OrderDetails are new and should be added. (A similar problem exists when using Entity Framework.)

A possible fix is to use AutoMapper.Collection. This will add/update/delete items in a preexisting collection object instead of creating a new collection.

To enable this add the Nuget package to your project and update the Automapper configuration:
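A sketch of that configuration; AddCollectionMappers and EqualityComparison come from the AutoMapper.Collection package:

```csharp
var config = new MapperConfiguration(cfg =>
{
    // Enables collection diffing (add/update/delete instead of replacing the collection)
    cfg.AddCollectionMappers();

    cfg.CreateMap<OrderDetailDto, OrderDetail>()
       // Match DTOs to existing entities by Id instead of treating everything as new
       .EqualityComparison((dto, entity) => dto.Id == entity.Id);
});
```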

Monday, November 9, 2020

Microsoft Orleans- Warning NoOpHostEnvironmentStatistics

When starting up my Orleans cluster I noticed the following warning message:

warn: Orleans.Runtime.NoOpHostEnvironmentStatistics[100708]

      No implementation of IHostEnvironmentStatistics was found. Load shedding will not work yet

To get rid of this warning you need to include the following line in the SiloBuilder config:
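On Windows this can be done with the performance-counter based implementation from the Microsoft.Orleans.OrleansTelemetryConsumers.Counters package (a sketch; on Linux there is a similar UseLinuxEnvironmentStatistics extension):

```csharp
var silo = new SiloHostBuilder()
    // ...other silo configuration...
    // Registers an IHostEnvironmentStatistics implementation so load shedding can work
    .UsePerfCounterEnvironmentStatistics()
    .Build();
```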

This will enable support for load shedding, which kicks in when the system gets overloaded and prevents new clients from connecting.

Friday, November 6, 2020

Kafka - Avro - Value serialization error

A commonly used serialization format for Kafka is Avro. I mentioned how to use it in C# yesterday.

Today I got into trouble when trying to send a specific message.

I changed the example a little bit to explicitly point out the issue: I set the value of 'Browser' to null.
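Reconstructed, the failing message looked something like this (the field names follow the PageViewEvent example; the rest of the producer setup is omitted):

```csharp
var pageView = new PageViewEvent
{
    Url = "/home",
    Browser = null // the schema declares Browser as 'string', so this cannot be serialized
};

await producer.ProduceAsync("page-views",
    new Message<string, PageViewEvent> { Key = "user-1", Value = pageView });
```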

When trying to send this message it failed with the following error:

Local: Value serialization error

Let’s have a look at the related avro schema:
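A sketch of such a schema:

```json
{
  "namespace": "Example",
  "type": "record",
  "name": "PageViewEvent",
  "fields": [
    { "name": "Url", "type": "string" },
    { "name": "Browser", "type": "string" }
  ]
}
```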

The problem is that the schema specifies that the Browser field should contain a value of type string, and null is not a valid string value. This explains why it fails.

To solve this I have two options:

1) Change the code to send an empty string instead of null:

2) Update the schema to allow null values for the Browser field:
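Making the field nullable is done with an Avro union type (a sketch of the updated field definition):

```json
{ "name": "Browser", "type": ["null", "string"], "default": null }
```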

More about Avro:

Thursday, November 5, 2020

Kafka- Using Avro as serialization format in C#

To help you with using Avro as the serialization format for your Kafka messages, a .NET Core global tool called avrogen is available.

  • First install the tool using dotnet tool install:
  • Next step is to specify your message schema. Therefore you need to create an .avsc file and add your message specification:
  • Now it’s time to generate the necessary code:
  • This will generate the following:
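The steps above can be sketched as follows (the schema file name is an example, and the exact tool package name may differ per version):

```
# Install the avrogen global tool
dotnet tool install --global Apache.Avro.Tools

# Generate C# classes from the schema file into the current directory
avrogen -s PageViewEvent.avsc .
```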

The generated type can then be used by your Producer and Consumer logic.

More information:

Wednesday, November 4, 2020

DDD–Strongly typed Ids–Using C# 9 Record Types

I blogged before about the usage of strongly typed Ids in your domain models. With the introduction of C# 9 record types, an alternative approach becomes an option.

Record types are reference types with built-in immutability and value semantics. They automatically provide implementations for Equals, GetHashCode, etc., and offer a very concise syntax known as positional records. This allows us to rewrite our ProductId type using records:

public record ProductId(int Value);

That's all that is needed.

Thank you C# 9!

Tuesday, November 3, 2020

Sending a message through Kafka - Value serializer not specified and there is no default serializer defined for type

My first attempt to send a typed message through Kafka resulted in the following error message:

Value cannot be null. (Parameter 'Value serializer not specified and there is no default serializer defined for type PageViewEvent)

Here is the code I was using:

As the error message mentions, you need to explicitly specify what serializer should be used for your message object. Therefore you need to use the SchemaRegistryClient and specify a serializer (I'm using Avro in the sample below):
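A sketch of what that can look like with the Confluent.SchemaRegistry.Serdes.Avro package (the broker and registry URLs are placeholders):

```csharp
var schemaRegistry = new CachedSchemaRegistryClient(
    new SchemaRegistryConfig { Url = "http://localhost:8081" });

using var producer = new ProducerBuilder<string, PageViewEvent>(
        new ProducerConfig { BootstrapServers = "localhost:9092" })
    // Explicitly register an Avro serializer for the message value
    .SetValueSerializer(new AvroSerializer<PageViewEvent>(schemaRegistry))
    .Build();

await producer.ProduceAsync("page-views",
    new Message<string, PageViewEvent> { Key = "user-1", Value = pageView });
```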

Monday, November 2, 2020

Configure a Kafka topic in C#

By default, when you use the Confluent Kafka .NET client, a topic is created automatically for you when you publish your first message.

However, this will create the topic using the default settings. Typically you want to have more control when creating a topic. This is possible through the usage of the AdminClient:
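A sketch using the AdminClient from Confluent.Kafka (the topic name and settings are examples):

```csharp
using var adminClient = new AdminClientBuilder(
    new AdminClientConfig { BootstrapServers = "localhost:9092" }).Build();

await adminClient.CreateTopicsAsync(new[]
{
    new TopicSpecification
    {
        Name = "page-views",
        NumPartitions = 3,     // control partitioning up front
        ReplicationFactor = 1
    }
});
```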