

Showing posts from May, 2020

Using YARP to create a reverse proxy server

So far I’ve always used ProxyKit to create a reverse proxy in ASP.NET Core. But with the announcement of Yarp, it is time to try this alternative… I created a new ASP.NET Core “empty” project:

dotnet new web -n ProxyTest -f netcoreapp3.1

The template "ASP.NET Core Empty" was created successfully. Processing post-creation actions... Running 'dotnet restore' on ProxyTest\ProxyTest.csproj... Restore completed in 278,54 ms for C:\Projects\test\yarptest\ProxyTest\ProxyTest.csproj. Restore succeeded.

The next step is to reference the Microsoft.ReverseProxy preview NuGet package:

<ItemGroup> <PackageReference Include="Microsoft.ReverseProxy" Version="1.0.0-preview.1.*" /> </ItemGroup>

Now it is time to update our Startup.cs. This is what I had when using ProxyKit: And here is the updated Startup.cs after switching to Yarp: In Yarp everything
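The excerpt cuts off before the updated Startup.cs. As a hedged sketch only (the YARP API changed between previews, and the config section name is an assumption), a preview-era setup looked roughly like this:

```csharp
// Illustrative sketch, not the post's exact code: routes and clusters are
// loaded from a "ReverseProxy" configuration section in appsettings.json.
public class Startup
{
    private readonly IConfiguration _configuration;

    public Startup(IConfiguration configuration) => _configuration = configuration;

    public void ConfigureServices(IServiceCollection services)
    {
        // Register the reverse proxy and bind it to configuration
        services.AddReverseProxy()
                .LoadFromConfig(_configuration.GetSection("ReverseProxy"));
    }

    public void Configure(IApplicationBuilder app)
    {
        app.UseRouting();
        // The proxy plugs into endpoint routing like any other endpoint
        app.UseEndpoints(endpoints => endpoints.MapReverseProxy());
    }
}
```

Compared to ProxyKit's middleware-based approach, YARP hangs off endpoint routing, so it composes with other endpoints in the same application.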

GraphQL Inspector

While versioning is a hot topic in traditional REST APIs, GraphQL takes a strong opinion on avoiding versioning by providing the tools for the continuous evolution of a GraphQL schema. As GraphQL only returns the data that is explicitly requested, it becomes easier to introduce new functionality by adding new types and fields without introducing breaking changes. As you know which fields are used by which clients, you have a lot more knowledge in your hands to prevent breaking your clients. For small schemas it can be feasible to inspect your schema for changes manually, but for larger or federated schemas good tooling becomes a necessity. A tool that can help you achieve this is GraphQL Inspector. It offers the following (free) features:
- Compares schemas
- Detects breaking or dangerous changes
- Schema change notifications
- Uses serverless functions to validate changes
- Validates operations and fragments against a schema
- Finds similar / duplicated
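To make the feature list concrete, here is an illustrative GraphQL Inspector CLI session (file names are hypothetical; this assumes Node.js is available):

```shell
# Install the CLI (graphql is a required peer dependency)
npm install -g @graphql-inspector/cli graphql

# Compare two schema versions and report breaking or dangerous changes
graphql-inspector diff old-schema.graphql new-schema.graphql

# Validate stored operations and fragments against the new schema
graphql-inspector validate "./documents/**/*.graphql" new-schema.graphql
```

The `diff` command exits with a non-zero status when it finds breaking changes, which makes it easy to wire into a CI pipeline.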

Hands-on-labs: App modernization

A colleague shared the following hands-on-lab with me: It’s a great starting point to learn about the cloud and take your first steps towards it. It combines a whiteboard design session and a hands-on-lab. This is what you will design and build:

.NET Core–Generate documentation

Although I try to make my APIs as descriptive as possible, sometimes good documentation can still make a difference. One way to enable documentation generation is through Visual Studio: right-click on your project and select Properties. On the Properties window, go to the Build tab and check the XML documentation file checkbox. Don’t forget to save these changes. As a result, the following is added to your csproj file: There are a few things I don’t like about this: first, a condition is applied to the PropertyGroup, which doesn’t seem necessary; second, an absolute path is used to define where to generate the documentation XML. So I would recommend no longer using this approach. What you can do instead is directly manipulate the csproj file and add the following line to a PropertyGroup:
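The line the post refers to is presumably the `GenerateDocumentationFile` property, which avoids both the condition and the absolute path:

```xml
<PropertyGroup>
  <!-- Generates ProjectName.xml next to the build output; no absolute path needed -->
  <GenerateDocumentationFile>true</GenerateDocumentationFile>
</PropertyGroup>
```

With this property, the documentation file follows the project's output path and configuration automatically.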

Git sparse checkout

With the growing usage of mono-repositories, the standard git checkout or git status commands become frustratingly slow. A solution would be to use Git LFS (Large File Storage), but not all repositories have this extension installed. An alternative solution is the (new) git sparse-checkout command. To restrict your working directory to a set of directories, run the following commands:

git sparse-checkout init
git sparse-checkout set <dir1> <dir2> ...

If you get stuck, run git sparse-checkout disable to return to a full working directory. Remark: this feature is part of git 2.25. So if the command is not recognized, check your git version and update first. More information:
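The commands above can be tried end-to-end on a throwaway repository (the directory and file names below are just for the demo; `--cone` is added for predictable directory-based matching):

```shell
set -e
# Build a tiny repo with two directories
tmp=$(mktemp -d)
cd "$tmp"
git init -q repo
cd repo
mkdir dir1 dir2
echo "a" > dir1/a.txt
echo "b" > dir2/b.txt
git add .
git -c user.name=demo -c user.email=demo@example.com commit -qm "init"

# Restrict the working tree to dir1 only
git sparse-checkout init --cone
git sparse-checkout set dir1

ls   # dir1 remains; dir2 has been removed from the working tree
```

Note that the full history is still present in .git; sparse-checkout only trims the working tree, which is what makes status and checkout fast again.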

Azure Pipelines- Error executing dotnet restore task

When trying to execute dotnet restore during a build it failed with the following error message:

2020-05-12T18:14:36.8332220Z C:\Program Files\dotnet\sdk\3.1.201\NuGet.targets(536,5): error : The '@' character, hexadecimal value 0x40, cannot be included in a name. Line 6, position 35. [D:\b\4\agent\_work\200\s\IAM.Core\IAM.Core.csproj]
2020-05-12T18:14:36.8820520Z 2>Done Building Project "D:\b\4\agent\_work\200\s\IAM.Core\IAM.Core.csproj" (_GenerateRestoreGraphProjectEntry target(s)) -- FAILED.
2020-05-12T18:14:36.9152564Z 1>Project "D:\b\4\agent\_work\200\s\IAM.Core.Tests\VLM.IAM.Core.Tests.csproj" (1) is building "D:\b\4\agent\_work\200\s\IAM.Core.Tests\IAM.Core.Tests.csproj" (1:5) on node 1 (_GenerateRestoreGraphProjectEntry target(s)).
2020-05-12T18:14:36.9162330Z 1>C:\Program Files\dotnet\sdk\3.1.201\NuGet.targets(536,5): error : NuGet.Config is not valid XML. Path: 'D:\b\4\agent\_work\200\Nuget\t
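The second error line points at the likely cause: an invalid NuGet.Config. A common way to hit exactly this message is that NuGet's `packageSourceCredentials` section uses the feed name as an XML element name, and XML element names may not contain '@'. The fragment below is a hypothetical illustration of the constraint, not the actual config from the build:

```xml
<!-- Illustrative NuGet.config: the feed key reappears as an XML element name
     under packageSourceCredentials, so it must be a valid XML name
     (no '@', no spaces, not starting with a digit). -->
<configuration>
  <packageSources>
    <add key="MyFeed" value="https://example.com/nuget/v3/index.json" />
  </packageSources>
  <packageSourceCredentials>
    <MyFeed>
      <add key="Username" value="build-user" />
      <add key="ClearTextPassword" value="secret" />
    </MyFeed>
  </packageSourceCredentials>
</configuration>
```

Renaming the source (or escaping illegal characters) in both places keeps the file valid XML and lets restore parse it again.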

ASP.NET Core–The magic appearance of IMemoryCache

I created a small security library in .NET Core that simplifies the rather complex security setup we have at one of my clients. Inside this library I’m using the IMemoryCache to cache some non-volatile data. When a colleague tried to use this library, he told me that he had to add the following line. This doesn’t seem unexpected, but the strange thing was that I had added this nowhere in my example project!? Time to investigate… A walk through the ASP.NET Core source code (always a fun experience to discover and learn something new about the framework) taught me the following: when you call AddMvc() or AddResponseCaching(), the framework registers an IMemoryCache for you behind the scenes. If you are using a lower-level method like AddControllers(), this is not the case. Learned something? Check!
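The registration the colleague had to add was presumably the explicit one; a minimal sketch of the two situations:

```csharp
public void ConfigureServices(IServiceCollection services)
{
    // AddMvc() or AddResponseCaching() would register IMemoryCache implicitly,
    // but the leaner AddControllers() does not...
    services.AddControllers();

    // ...so register it explicitly for any library that depends on it
    services.AddMemoryCache();
}
```

AddMemoryCache() uses TryAdd semantics internally, so calling it when the cache is already registered is harmless.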

Lens–The Kubernetes IDE

If you are working with Kubernetes, I can recommend Lens, an open-source and free IDE to take control of your Kubernetes clusters.

Azure Pipelines–DotNet restore error

After configuring a new build pipeline, the build failed with the following error when trying to execute the dotnet restore build task:

NuGet.targets(124,5): error : Unable to load the service index for source http://tfs:8080/DefaultCollection/_packaging/Feed/nuget/v3/index.json
NuGet.targets(124,5): error : No credentials are available in the security package

The strange thing was that the same task worked without a problem in other builds; only newly created builds failed with the error message above. A workaround that seemed to work was to switch the dotnet build task to the ‘custom’ command. By using the custom command I can add an extra ‘--force’ argument to the ‘dotnet restore’ command. Adding this extra argument got rid of the error message above.
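In YAML form, the workaround described above would look roughly like this (the display name is illustrative; your project selection may differ):

```yaml
# Run restore through the 'custom' command so extra arguments can be passed
- task: DotNetCoreCLI@2
  displayName: 'dotnet restore (forced)'
  inputs:
    command: custom
    custom: restore
    arguments: '--force'
```

The built-in `restore` command of DotNetCoreCLI@2 does not expose a free-form arguments field, which is why switching to `custom` is needed to sneak in `--force`.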

Azure Charts–Help! Azure is evolving too fast…

Like most cloud platforms, Azure is evolving quite fast. This makes it hard to keep up to date and to know where you should focus your energy. Azure Charts can help. It is a web-based application which allows you to see what Azure consists of and how it evolves. I would specifically recommend taking a look at the Learning section to see what new learning content got published. More information:

Azure Pipelines error - NuGet.CommandLine.CommandLineException: Error parsing solution file

After installing the latest Visual Studio version on our build servers, some of our builds started to fail with the following error message. The error only happened on the build servers running the new MSBuild version:

One or more errors occurred. ---> NuGet.CommandLine.CommandLineException: Error parsing solution file at D:\b\4\agent\_work\153\s\VLM.MELO.sln: Exception has been thrown by the target of an invocation.
at NuGet.CommandLine.MsBuildUtility.GetAllProjectFileNamesWithMsBuild(String solutionFile, String msbuildPath)
at NuGet.CommandLine.RestoreCommand.ProcessSolutionFile(String solutionFileFullPath, PackageRestoreInputs restoreInputs)

This turns out to be a bug in the NuGet client, where older versions have trouble with this new version of MSBuild. To resolve this issue in Azure Pipelines, add a NuGet Tool Installer task to your pipeline before any tasks that use NuGet, and set the version field to include the latest version.
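In YAML, the fix sketched above would look something like this (the version spec is illustrative; pick whatever recent NuGet version matches your MSBuild):

```yaml
# Pin the NuGet client to a recent version before any NuGet tasks run
- task: NuGetToolInstaller@1
  inputs:
    versionSpec: '5.x'
    checkLatest: true

- task: NuGetCommand@2
  inputs:
    command: restore
    restoreSolution: '**/*.sln'
```

Placing the installer first guarantees every subsequent NuGet task on that agent uses the pinned client instead of whatever happens to be on the build server.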

Serilog - IDiagnosticContext

The ‘classic’ way I used to attach extra properties to a log message in Serilog was through the LogContext. From the documentation: properties can be added and removed from the context using LogContext.PushProperty(). A disadvantage of using the LogContext is that the additional information is only available inside the scope of that specific log context (or deeper nested contexts). This typically leads to a larger number of logs, which doesn’t always help to find out what is going on. Today I try to follow a different approach where I only log a single message at the end of an operation. The idea is that the log message is enriched during the lifetime of an operation so that we end up with a single log entry. This is easy to achieve in Serilog thanks to the IDiagnosticContext interface. The diagnostic context provides an execution context (similar to LogContext) with the advantage that it can be enriched throughout its lifetime. The request logging middleware then uses this to e
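A minimal sketch of enriching the diagnostic context from application code (the service and property names are illustrative, not from the post):

```csharp
// Anywhere in the request pipeline, inject Serilog's IDiagnosticContext
// and attach properties; they all end up on the single request-completion
// event written by UseSerilogRequestLogging().
public class OrderService
{
    private readonly IDiagnosticContext _diagnosticContext;

    public OrderService(IDiagnosticContext diagnosticContext)
        => _diagnosticContext = diagnosticContext;

    public void PlaceOrder(Order order)
    {
        _diagnosticContext.Set("OrderId", order.Id);
        _diagnosticContext.Set("ItemCount", order.Items.Count);
        // ... actual work ...
    }
}
```

Because every Set() call lands on the same completion event, you get one richly structured log entry per request instead of a trail of scattered messages.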

Virtual Azure Community Day–March 2020

In case you missed the last (virtual) Azure Community Day in March, all content is available online: you get 4 tracks, each with 8 hours of content! A must-see for every Azure addict…

XUnit–Could not load file or assembly 'Microsoft.VisualStudio.CodeCoverage.Shim’

When executing my XUnit tests on the build server, it failed with the following message:

System.IO.FileNotFoundException : Could not load file or assembly 'Microsoft.VisualStudio.CodeCoverage.Shim, Version=, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a'. The system cannot find the file specified.

Inside my csproj file the following packages were referenced:

<PackageReference Include="xunit" Version="2.4.0" />
<PackageReference Include="xunit.runner.visualstudio" Version="2.4.0" />
<PackageReference Include="coverlet.collector" Version="1.0.1" />

The ‘xunit.runner.visualstudio’ package implicitly has a dependency on Microsoft.NET.Test.Sdk (at minimum version 15.0), which could explain why it tried to load the assembly mentioned above. To get rid of this error, I had to explicitly include a reference to ‘Microsoft.NET.Test.Sdk’: <PackageReference Include="Mic
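The fix described above amounts to an ItemGroup along these lines (the Test.Sdk version is illustrative; any recent version should do):

```xml
<ItemGroup>
  <!-- Explicit reference so the CodeCoverage.Shim assembly can be resolved -->
  <PackageReference Include="Microsoft.NET.Test.Sdk" Version="16.5.0" />
  <PackageReference Include="xunit" Version="2.4.0" />
  <PackageReference Include="xunit.runner.visualstudio" Version="2.4.0" />
  <PackageReference Include="coverlet.collector" Version="1.0.1" />
</ItemGroup>
```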

C# 6–Using static

Until now I had never used the ‘using static’ directive introduced in C# 6. To simplify the assertions in my tests, I created a static TestHelper: I’m using this TestHelper to simplify my NetArchTest checks (but more about that in another blog post). Inside my tests I can now do the following: Neat!
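The post's TestHelper code didn't survive the excerpt, but the directive itself is easy to illustrate with a standard type such as System.Math:

```csharp
// 'using static' imports a type's static members, so they can be
// called without the type prefix.
using static System.Math;

public static class Demo
{
    // PI and Pow resolve without the Math. prefix
    public static double CircleArea(double radius) => PI * Pow(radius, 2);
}
```

Applied to a static TestHelper, the same mechanism lets test methods call assertion helpers directly, keeping the assertions short and readable.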

Turning your .NET Core 3 Worker Service into a Windows Service

Until the introduction of the Worker Service in .NET Core 3, I always used TopShelf to turn my console application into a Windows Service. Let’s see how we can do this using the built-in packages… Start by creating a new worker project from the command line (or open Visual Studio and search for ‘worker’ in the available templates):

dotnet new worker

Now add the following NuGet package to our project:

Install-Package Microsoft.Extensions.Hosting.WindowsServices

Next we have to modify our Program.cs file and add a “UseWindowsService()”. That’s it! Of course, you don’t have to believe me just like that, so let’s try to install our newly created Windows Service. This can be done through the standard Windows service installer:

sc create ExampleService BinPath=C:\Projects\test\ExampleService\bin\Debug\netcoreapp3.1\ExampleService.exe

Now open up your services window and have a look:
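The Program.cs change mentioned above is a single call on the host builder; a sketch based on the default worker template:

```csharp
public class Program
{
    public static void Main(string[] args) =>
        CreateHostBuilder(args).Build().Run();

    public static IHostBuilder CreateHostBuilder(string[] args) =>
        Host.CreateDefaultBuilder(args)
            // Provided by Microsoft.Extensions.Hosting.WindowsServices:
            // integrates the host with the Windows Service control manager
            .UseWindowsService()
            .ConfigureServices(services =>
                services.AddHostedService<Worker>());
}
```

A nice property of UseWindowsService() is that it is a no-op when the process is not running as a Windows Service, so the same binary still works from the command line during development.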

Newtonsoft JSON magic

In one of our applications, we had the following code: We were using events to share information between the different modules in our Modulith. These same events were also stored in an Event Log as a source for possible projections into a read model. However, in the case of the code above, the ‘OperationId’ was always {00000000-0000-0000-0000-000000000000} after deserializing the ProductCreated event. Do you notice what is wrong? Important to notice here is that our ‘OperationId’ is a read-only property that is set through the constructor. You may already wonder why the JSON serializer is smart enough to fill in this property if only a getter exists. This is because it has a neat trick where it looks at the constructor of the object and, if it can find a corresponding parameter, it will call the constructor and pass on the parameter value. Now that you know this, take a second look at the code above. The reason it fails is because there is a typo in the constructor of the ProductCre
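The original code didn't survive the excerpt, so here is a hypothetical reconstruction of the bug pattern (class members and the misspelling are illustrative). Json.NET matches constructor parameters to JSON property names case-insensitively, so a genuinely misspelled parameter name finds no match and receives its default value:

```csharp
public class ProductCreated
{
    // Read-only: can only be set through the constructor
    public Guid OperationId { get; }
    public string Name { get; }

    // Typo: no JSON property matches "opertionId", so the deserializer
    // passes default(Guid) — i.e. {00000000-0000-0000-0000-000000000000}
    public ProductCreated(Guid opertionId, string name)
    {
        OperationId = opertionId;
        Name = name;
    }
}
```

Renaming the parameter to match the property (operationId) lets the constructor-matching trick do its job again.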