Friday, December 24, 2021

ADFS - Windows authentication

One of my clients is using Microsoft ADFS as their Security Token Service. Through ADFS, people can log in either by using their eID (the Belgian electronic identity card) or through their internal domain account (using Windows authentication).

After upgrading our ADFS server, we noticed that people were asked for their credentials and that their Windows credentials were no longer passed automatically.

This is of course quite annoying.

We took a look at the ADFS settings and noticed that ‘Mozilla/5.0’ was missing from the list of supported user agents:

PS C:\Users\bawu> (Get-AdfsProperties).WIASupportedUserAgents

MSAuthHost/1.0/In-Domain

MSIE 6.0

MSIE 7.0

MSIE 8.0

MSIE 9.0

MSIE 10.0

Trident/7.0

MSIPC

Windows Rights Management Client

MS_WorkFoldersClient

=~Windows\s*NT.*Edge

To fix it we updated the list of supported agents:

Set-ADFSProperties -WIASupportedUserAgents @("MSAuthHost/1.0/In-Domain","MSIE 6.0", "MSIE 7.0", "MSIE 8.0", "MSIE 9.0", "MSIE 10.0", "Trident/7.0", "MSIPC", "Windows Rights Management Client", "MS_WorkFoldersClient", "=~Windows\s*NT.*Edge", "Mozilla/5.0")
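Rather than retyping the full list, you can also append the new entry to the existing values. A small sketch (note: in my experience a restart of the ADFS service is typically needed before such a change takes effect):

Set-AdfsProperties -WIASupportedUserAgents ((Get-AdfsProperties | Select -ExpandProperty WIASupportedUserAgents) + "Mozilla/5.0")
Restart-Service adfssrv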

That did the trick!

Thursday, December 23, 2021

Elastic APM–Use .NET OpenTelemetry

Elastic has their own Application Performance Monitoring solution as part of their Elastic Observability product.

An important part of the solution are the ‘agents’. Agents are responsible for instrumenting your application, collecting the metrics and sending them to the APM server.

Specifically for .NET, the APM agent is released as a series of NuGet packages.

With the release of OpenTelemetry for .NET I was wondering if we could replace this Elastic APM specific solution with standard OpenTelemetry.

In its first incarnation, the APM server didn’t support the OpenTelemetry standard and you had to use a separate collector that converted the OpenTelemetry data to the Elastic APM format:

Since version 7.13 of Elastic APM this is no longer necessary. The OpenTelemetry Collector exporter for Elastic was deprecated and replaced by native support for the OpenTelemetry protocol (OTLP) in Elastic Observability.

Now the only thing you need to do is add some resource attributes and configure the OtlpExporter:
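The original code isn’t reproduced here; below is a minimal sketch using the OpenTelemetry .NET SDK. The service name, endpoint and token are placeholders, and the instrumentation calls are assumptions based on a typical ASP.NET Core setup:

using System;
using OpenTelemetry;
using OpenTelemetry.Resources;
using OpenTelemetry.Trace;

// Requires the OpenTelemetry, OpenTelemetry.Exporter.OpenTelemetryProtocol
// and the relevant OpenTelemetry.Instrumentation.* NuGet packages
using var tracerProvider = Sdk.CreateTracerProviderBuilder()
    .SetResourceBuilder(ResourceBuilder.CreateDefault()
        .AddService("ExampleApp")) // becomes the service name in Elastic APM
    .AddAspNetCoreInstrumentation()
    .AddHttpClientInstrumentation()
    .AddOtlpExporter(options =>
    {
        options.Endpoint = new Uri("https://apm.example.com:8200"); // APM server endpoint (placeholder)
        options.Headers = "Authorization=Bearer <apm-secret-token>"; // APM secret token (placeholder)
    })
    .Build();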

Wednesday, December 22, 2021

Visual Studio 2022–New breakpoint types

Of course it is always better to not write any bugs, but I know I make mistakes. So sooner or later I need to debug my code to find what’s going wrong.

In that case breakpoints are an important aid to halt your application at the right location. Visual Studio 2022 introduces two new breakpoint types: temporary and dependent breakpoints.

The Temporary breakpoint is used to set a breakpoint that will only break once. When debugging, the Visual Studio debugger only pauses the running application once for this breakpoint and then removes it immediately after it has been hit.

To set a temporary breakpoint, hover over the breakpoint symbol, choose the Settings icon, and then select Remove breakpoint once hit in the Breakpoint Settings window:

You can also use the right-click context menu to set the temporary breakpoint by selecting Insert Temporary Breakpoint from the context menu:

The Dependent breakpoint is used to set a breakpoint that will only break when another breakpoint is hit. This can make debugging code in common paths, such as a game loop or a utility API, much easier, because a breakpoint in those functions can be configured to break only when the function is invoked from a specific part of your application.

To set a dependent breakpoint, hover over the breakpoint symbol, choose the Settings icon, and then select Only enable when the following breakpoint is hit in the Breakpoint Settings window.

In the dropdown, select the prerequisite breakpoint you want your current breakpoint to be dependent on.

You can also use the right-click context menu to set the dependent breakpoint by selecting Insert Dependent Breakpoint from the context menu:

Tuesday, December 21, 2021

Error after upgrading to Microsoft.Data.SqlClient 4

After upgrading to Microsoft.Data.SqlClient 4, I immediately started to get connection failures.

Let’s have a look at the exact error message:

A connection was successfully established with the server, but then an error occurred during the login process. (provider: SSL Provider, error: 0 - The certificate chain was issued by an authority that is not trusted.)

The reason for this error is that with the release of version 4.0, a breaking change was introduced to improve security:

The default value of the `Encrypt` connection option changed from `false` to `true`. With the increased emphasis on secure-by-default, the growing use of cloud databases, and the need to ensure connections are secure, Microsoft decided it was time for this backwards-compatibility-breaking change.

You can of course get back to the previous behavior by explicitly setting the ‘Encrypt’ option to ‘false’:

"server=exampleserver;database=ExampleDB;integrated security=True;Encrypt=False"

But a better solution is to enable encrypted connections to the database.
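If you can’t get a trusted certificate installed on the database server right away, an intermediate option is to keep encryption enabled but skip the certificate chain validation. Be aware that this leaves the connection vulnerable to man-in-the-middle attacks:

"server=exampleserver;database=ExampleDB;integrated security=True;Encrypt=True;TrustServerCertificate=True"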

Monday, December 20, 2021

Kubernetes Job Containers - (Forbidden): jobs.batch "example-migration" is forbidden

For our database migrations we are using Kubernetes Jobs and init containers, as discussed here.

However when we tried to deploy the job container, it failed with the following error:

Error from server (Forbidden): jobs.batch "example-migration" is forbidden: User "system:serviceaccount:example-ns:default" cannot get resource "jobs" in API group "batch" in the namespace "example-ns": Azure does not have opinion for this user.

To read and list jobs, the deployment uses the default service account in the “example-ns” namespace. This default service account does not have the necessary API rights in the Kubernetes cluster.

To fix it we created a new service account, role and role binding:
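A sketch of what these manifests could look like (the names are illustrative; the namespace matches the error message above):

apiVersion: v1
kind: ServiceAccount
metadata:
  name: migration-sa
  namespace: example-ns
---
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  name: job-reader
  namespace: example-ns
rules:
- apiGroups: ["batch"]
  resources: ["jobs"]
  verbs: ["get", "list", "watch"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: job-reader-binding
  namespace: example-ns
subjects:
- kind: ServiceAccount
  name: migration-sa
  namespace: example-ns
roleRef:
  kind: Role
  name: job-reader
  apiGroup: rbac.authorization.k8s.io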

After doing that, we had to update our deployment to use this service account:
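In the pod template of the deployment, that comes down to referencing the service account by name (using the hypothetical name from the sketch above):

spec:
  template:
    spec:
      serviceAccountName: migration-sa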

Friday, December 17, 2021

Become a master at Git and Open Source

Interested in contributing to an Open Source project, but your experience with Git is rather limited? Then this learning course is the one for you!

Sign up for this free course here.

You’ll get regular mails covering the following topics:

  • Learning Git by exploring open-source repositories
  • Exploring a Git repository using Visual Studio
  • Contributing to an open source project
  • Adding your existing code to Git and GitHub
  • Next steps

Thursday, December 16, 2021

Swagger UI - Add required header

One of our REST APIs always requires an ‘X-API-Key’ header. To simplify testing, I wanted the option to specify the header in the Swagger UI.

Let’s see how we can get this done through Swashbuckle.

First I need to create a custom IOperationFilter that will add the header:
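The filter from the original post isn’t shown here; a minimal sketch could look like this (the filter name is illustrative):

using System.Collections.Generic;
using Microsoft.OpenApi.Models;
using Swashbuckle.AspNetCore.SwaggerGen;

public class AddApiKeyHeaderOperationFilter : IOperationFilter
{
    public void Apply(OpenApiOperation operation, OperationFilterContext context)
    {
        operation.Parameters ??= new List<OpenApiParameter>();

        // Add the X-API-Key header as a required parameter to every operation
        operation.Parameters.Add(new OpenApiParameter
        {
            Name = "X-API-Key",
            In = ParameterLocation.Header,
            Required = true,
            Schema = new OpenApiSchema { Type = "string" }
        });
    }
}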

Now we need to update our Swagger configuration to use this header:
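Registering the filter happens in the AddSwaggerGen call (a sketch, reusing the filter name from above):

services.AddSwaggerGen(options =>
{
    options.SwaggerDoc("v1", new OpenApiInfo { Title = "Example API", Version = "v1" });
    options.OperationFilter<AddApiKeyHeaderOperationFilter>();
});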

If we now run our application and browse to the Swagger UI, we should see the extra header parameter:

Wednesday, December 15, 2021

ASP.NET Core 6–Http Logging

With the release of ASP.NET Core 6, new HTTP logging middleware became available. This middleware logs information about HTTP requests and responses as log messages through the Microsoft.Extensions.Logging framework.

To enable the middleware, add the following code to your Program.cs:
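For a minimal .NET 6 application this comes down to:

var builder = WebApplication.CreateBuilder(args);

var app = builder.Build();

app.UseHttpLogging();

app.MapGet("/", () => "Hello World!");

app.Run();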

This will add an instance of the Microsoft.AspNetCore.HttpLogging.HttpLoggingMiddleware class.

If you now run your application, you will notice no difference and nothing is logged. This is because the HTTP Logging messages are handled as informational messages and these are not enabled by default.

To adjust the log level, you can add the following line to the Logging > LogLevel configuration in the appsettings.json file:
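The resulting section looks like this:

{
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft.AspNetCore.HttpLogging.HttpLoggingMiddleware": "Information"
    }
  }
}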

You can further configure the logging through the AddHttpLogging method:
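A sketch of the options you can set (the header name and limits are examples):

using Microsoft.AspNetCore.HttpLogging;

builder.Services.AddHttpLogging(logging =>
{
    // Select which parts of the request/response are logged
    logging.LoggingFields = HttpLoggingFields.All;
    logging.RequestHeaders.Add("X-API-Key"); // allow an extra header to be logged
    logging.RequestBodyLogLimit = 4096;
    logging.ResponseBodyLogLimit = 4096;
});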

Tuesday, December 14, 2021

Visual Studio 2022–Inline hints

Bring enough .NET developers together and sooner or later someone will start the discussion about whether or not to use the ‘var’ keyword.

You have people in the ‘pro’ camp who like that they have to type less and don’t worry too much about the specific type. And you have people in the ‘contra’ camp who prefer explicit typing.

In Visual Studio 2022, you can get the best of both worlds by enabling ‘inline hints’. Inline hints can display parameter name hints and type hints for literals, function calls and more.

  • To enable this feature, go to Tools > Options > Text Editor > C# or Basic > Advanced.
  • Check the ‘Display inline parameter name hints’ checkbox
  • Check the ‘Display inline type hints’ checkbox

Now we can see that both our arguments and the var types are annotated:

Monday, December 13, 2021

ASP.NET Core - Who stole my cookie?

I stumbled over a strange issue I had in ASP.NET Core. A cookie created in ASP.NET MVC didn’t show up in my requests in ASP.NET Core.

This cookie was used to share some user preferences between 2 subsites in the same domain. The original cookie was created without setting a SameSite value but was also not marked as secure or httponly.

I first updated the cookie generation logic to set both the Secure and HttpOnly value to true. This is not strictly required but as I’m only reading the cookie in the backend this is already a step in the right direction.

Let’s now focus on the SameSite value. SameSite is an IETF draft standard designed to provide some protection against cross-site request forgery (CSRF) attacks. Cookies without a SameSite header are treated as SameSite=Lax by default. As we need SameSite=None to allow this cookie to be sent for cross-site usage, that was certainly our next step.

Remark: Cookies that assert SameSite=None must also be marked as Secure.

Although I could now finally see, using the web developer tools, that the cookie certainly was available as part of my request, it still didn’t show up inside my ASP.NET Core application when checking the Cookies property on my HTTP request.

There was one other problem with my cookie, and it was related to the data in the cookie. Let’s have a look:

Do you notice the ‘\’ in the value? This is what caused the problem: the original code didn’t URL-encode the cookie values. Although the old ASP.NET MVC application didn’t seem to mind, ASP.NET Core certainly does, and it refused to load and parse my cookie.

I had to update the original code to URL-encode the value:
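A sketch of the updated (ASP.NET MVC) code, assuming the cookie is written through HttpCookie; HttpUtility.UrlEncode takes care of the escaping:

using System.Web;

var preferences = @"pref1\pref2"; // example value containing a backslash
var cookie = new HttpCookie("UserPreferences", HttpUtility.UrlEncode(preferences))
{
    Secure = true,
    HttpOnly = true,
    SameSite = SameSiteMode.None
};
Response.Cookies.Add(cookie);

On the ASP.NET Core side the value can then be read from Request.Cookies and decoded with WebUtility.UrlDecode.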

Friday, December 10, 2021

Simple made easy

One of the inspirations for the name of my blog comes from this presentation by Rich Hickey. In case you’ve never watched this video, time to take it off your bucket list!

Watch it on InfoQ or on Youtube.

Thursday, December 9, 2021

Visual Studio 2022 - Test execution with Hot reload

Building your code before you can run your tests can take up a big part of the total test run time. The build time inside Visual Studio can vary depending on the kind of changes made to the code. For larger solutions, builds can be the most expensive part of the test run.

Visual Studio 2022 includes an experimental(!) feature that allows you to use hot reload to speed up test execution by skipping builds for supported scenarios.

To start using this feature, you first need to enable it by choosing Test > Options > "(Experimental) Enable Hot Reloaded Test Runs for C# and VB test projects targeting .NET 6 and higher":

Now when you execute a test in Visual Studio, Test Explorer will automatically use test execution with hot reload when possible. If a hot reload is not possible, it will fall back to the regular behavior of building and running tests. The nice thing is that this is all happening behind the scenes and you, as a developer, should not do anything different.

Remark: This feature only works when building your projects for the DEBUG configuration

More information: https://docs.microsoft.com/en-us/visualstudio/test/test-execution-with-hot-reload?view=vs-2022

Wednesday, December 8, 2021

ASP.NET Core - Could not load type 'Microsoft.AspNetCore.Mvc.MvcJsonOptions'

After upgrading some packages in an ASP.NET Core application, I got the following error message when I tried to run the application:

System.TypeLoadException: Could not load type 'Microsoft.AspNetCore.Mvc.MvcJsonOptions' from assembly 'Microsoft.AspNetCore.Mvc.Formatters.Json, Version=3.1.20.0, Culture=neutral, PublicKeyToken=adb9793829ddae60'.

   at System.Signature.GetSignature(Void* pCorSig, Int32 cCorSig, RuntimeFieldHandleInternal fieldHandle, IRuntimeMethodInfo methodHandle, RuntimeType declaringType)

   at System.Reflection.RuntimeConstructorInfo.get_Signature()

   at System.Reflection.RuntimeConstructorInfo.GetParametersNoCopy()

   at System.Reflection.RuntimeConstructorInfo.GetParameters()

   at Microsoft.Extensions.Internal.ActivatorUtilities.CreateInstance(IServiceProvider provider, Type instanceType, Object[] parameters)

   at Microsoft.AspNetCore.Builder.UseMiddlewareExtensions.<>c__DisplayClass4_0.<UseMiddleware>b__0(RequestDelegate next)

   at Microsoft.AspNetCore.Builder.ApplicationBuilder.Build()

   at Microsoft.AspNetCore.Hosting.GenericWebHostService.StartAsync(CancellationToken cancellationToken)

This issue was caused by using a newer Microsoft.AspNetCore.Mvc package together with an older Swashbuckle package. To solve the problem I had to update Swashbuckle to version 5:

<PackageReference Include="Swashbuckle.AspNetCore" Version="5.0.0" />
That’s it!

Tuesday, December 7, 2021

Blazor–Using Basic authentication

For an internal application I’m building I needed to use Basic authentication.

Remark: In case you forgot, Basic authentication transmits credentials (like a user ID/password pair) as a base64-encoded string in the Authorization header. This is of course not the most secure approach, as any man-in-the-middle can capture and read this header data.

If you look around on the Internet, a typical example of how to do this in .NET looks like this:
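(The snippet is reconstructed here as a minimal sketch; the credentials are placeholders.)

using System.Net;
using System.Net.Http;

var handler = new HttpClientHandler
{
    Credentials = new NetworkCredential("username", "password")
};
var client = new HttpClient(handler);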

We create an HttpClientHandler, set the Credentials and pass it to our HttpClient as a constructor parameter.

Of course it is even better not to create an HttpClient instance yourself but instead use the HttpClientFactory to create a named or typed HttpClient.

But if you try to use the code above in a Blazor application, you’ll end up with the following runtime error:

System.PlatformNotSupportedException: Property Credentials is not supported.

As the HttpClient implementation for Blazor stays as close as possible to the fetch API of the browser, you’ll need to take a different approach and set the Authorization header directly:
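A minimal sketch (placeholder credentials):

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;

var client = new HttpClient();
var bytes = Encoding.ASCII.GetBytes("username:password");
client.DefaultRequestHeaders.Authorization =
    new AuthenticationHeaderValue("Basic", Convert.ToBase64String(bytes));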

Monday, December 6, 2021

.NET 6 - The ArgumentNullException helper class

.NET 6 introduces a new throw helper on the ArgumentNullException class that uses the new [CallerArgumentExpression] attribute and the [DoesNotReturn] attribute: ArgumentNullException.ThrowIfNull.

This gives you an easy-to-use helper method that throws an ArgumentNullException for null values.

Thanks to the [CallerArgumentExpression] attribute, this helper gives you better error messages, as the compiler captures the expression passed to the method.
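For example (person is a hypothetical argument):

ArgumentNullException.ThrowIfNull(person);
// when person is null, this throws:
// System.ArgumentNullException: Value cannot be null. (Parameter 'person')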

This is the implementation of this helper:
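Slightly simplified, it looks like this:

public static void ThrowIfNull(
    [NotNull] object? argument,
    [CallerArgumentExpression("argument")] string? paramName = null)
{
    if (argument is null)
    {
        Throw(paramName);
    }
}

[DoesNotReturn]
private static void Throw(string? paramName) =>
    throw new ArgumentNullException(paramName);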

Before C# 10, you probably would have used the nameof keyword and implemented such a helper like this:
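A sketch of such a pre-C# 10 helper (the class name is illustrative):

public static class Guard
{
    public static void ThrowIfNull(object? argument, string paramName)
    {
        if (argument is null)
        {
            throw new ArgumentNullException(paramName);
        }
    }
}

// At every call site the parameter name had to be passed explicitly:
Guard.ThrowIfNull(person, nameof(person));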

Friday, December 3, 2021

.NET Conf 2021 - Sessions, slides and demos are online

In case you missed .NET Conf 2021, no worries: all sessions are recorded and available on the .NET YouTube channel or the new Microsoft Docs events hub. With over 80 sessions, you’ll know what to do during the Christmas holidays. Slide decks and demos can be found on the .NET Conf 2021 GitHub page. Have fun!


Thursday, December 2, 2021

.NET Core–The case of the disappearing authorization header

While building an internal (Blazor) application, I stumbled over some CORS issues. As this was an internal application, I decided to be lazy and just disable CORS on my request:
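(The snippet isn’t reproduced in this post; below is a minimal sketch of the approach. The URL, HttpClient instance and credentials are placeholders.)

using Microsoft.AspNetCore.Components.WebAssembly.Http;
using System.Net.Http.Headers;

var request = new HttpRequestMessage(HttpMethod.Get, "https://api.example.be/data");
request.SetBrowserRequestMode(BrowserRequestMode.NoCors); // skip the CORS preflight
request.Headers.Authorization = new AuthenticationHeaderValue("Basic", encodedCredentials);
var response = await httpClient.SendAsync(request);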

In the example above I’m using SetBrowserRequestMode() to disable the CORS preflight check.

After doing that the CORS issue was gone. Unfortunately my application still didn’t work, because now I got a 401 response back?!

I was quite confident that the provided username/password combination was correct. So what is going on?

I monitored my request using the browser developer tools and I noticed that the authorization header was missing:

What was going on?

The MDN documentation brought me the answer when I had a look at the request mode documentation specifically for ‘no-cors’:

no-cors — Prevents the method from being anything other than HEAD, GET or POST, and the headers from being anything other than simple headers. If any ServiceWorkers intercept these requests, they may not add or override any headers except for those that are simple headers. In addition, JavaScript may not access any properties of the resulting Response. This ensures that ServiceWorkers do not affect the semantics of the Web and prevents security and privacy issues arising from leaking data across domains.

So this explains why the authorization header is removed before sending the request.

Another reason why being lazy as a developer is not always a good idea. So I went back to my application and enabled CORS on the API I was calling…

Wednesday, December 1, 2021

Keep your project dependencies up to date with dotnet outdated

From the documentation:

When using Visual Studio, it is easy to find out whether newer versions of the NuGet packages used by your project are available, by using the NuGet Package Manager. However, the .NET Core command-line tools do not provide a built-in way for you to report on outdated NuGet packages.

dotnet-outdated is a .NET Core Global tool that allows you to quickly report on any outdated NuGet packages in your .NET Core and .NET Standard projects.

This is a great way to keep your applications up-to-date and can easily be integrated as part of your DevOps processes.

Install dotnet-outdated as a global tool:

dotnet tool install --global dotnet-outdated-tool

Now you can invoke it from your project or solution folder:

dotnet outdated

This is what the output looks like for one of my projects:

The colors make it very clear. Here is the related legend:

You can automatically upgrade packages by passing the ‘-u’ parameter:

dotnet outdated -u

Tuesday, November 30, 2021

GraphQL HotChocolate 12 - Updated Application Insights monitoring

It seems that with every release of HotChocolate, I can write a follow-up post. With the release of HotChocolate 11, I wrote a blog post on how to integrate it with Application Insights.

With HotChocolate 12, there was again a small update in the available interfaces and APIs.

Let’s check the changes we have to make…

  • First of all, our diagnostic class should no longer inherit from DiagnosticEventListener but from ExecutionDiagnosticEventListener.
  • The signature of the ExecuteRequest method has changed as well. Instead of returning an IActivityScope it should return an IDisposable:
  • This also means that our RequestScope no longer needs to implement the IActivityScope interface but only needs to implement IDisposable:

Here is the full example:
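The full example isn’t reproduced here; below is a minimal sketch of what the listener could look like in HotChocolate 12 (the class names and the Application Insights wiring are assumptions based on the HotChocolate 11 version):

using System;
using HotChocolate.Execution;
using HotChocolate.Execution.Instrumentation;
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.DataContracts;
using Microsoft.ApplicationInsights.Extensibility;

public class ApplicationInsightsDiagnosticEventListener : ExecutionDiagnosticEventListener
{
    private readonly TelemetryClient _telemetryClient;

    public ApplicationInsightsDiagnosticEventListener(TelemetryClient telemetryClient)
    {
        _telemetryClient = telemetryClient;
    }

    // Returns a plain IDisposable instead of the old IActivityScope
    public override IDisposable ExecuteRequest(IRequestContext context)
    {
        var operation = _telemetryClient.StartOperation<RequestTelemetry>(
            context.Request.OperationName ?? "GraphQL Request");
        return new RequestScope(_telemetryClient, operation);
    }

    private sealed class RequestScope : IDisposable
    {
        private readonly TelemetryClient _telemetryClient;
        private readonly IOperationHolder<RequestTelemetry> _operation;

        public RequestScope(TelemetryClient telemetryClient, IOperationHolder<RequestTelemetry> operation)
        {
            _telemetryClient = telemetryClient;
            _operation = operation;
        }

        public void Dispose() => _telemetryClient.StopOperation(_operation);
    }
}

The listener is then registered on the request executor builder: services.AddGraphQLServer().AddDiagnosticEventListener<ApplicationInsightsDiagnosticEventListener>();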

Monday, November 29, 2021

C# 10–Change an existing project to file scoped namespaces

C# 10 introduces file scoped namespaces. This allows you to remove the ‘{ }’ block when your source file only has one namespace (as most files typically do).

So instead of writing:
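(The namespace and class names are just examples.)

namespace Example.Domain
{
    public class Person
    {
    }
}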

you can now write:
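namespace Example.Domain;

public class Person
{
}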

To apply this to an existing project written in C# 9 or lower, you can make the switch in one go.

First, set the language version of your project to C# 10:
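In the .csproj file that looks like this (if you target net6.0, C# 10 is already the default):

<PropertyGroup>
  <LangVersion>10.0</LangVersion>
</PropertyGroup>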

Now we need to update our .editorconfig file and add the following line:
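Using a warning severity makes Visual Studio flag every remaining block-scoped namespace:

csharp_style_namespace_declarations = file_scoped:warning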

After doing that, Visual Studio will help us out and we can use “Fix all occurrences in Solution” to apply it in one go:


Friday, November 26, 2021

Running Azure on your laptop–Part 3–Prerequisites

In the previous post in this series I talked about why Azure Arc is also interesting for developers. Today we finally move on to the more practical part and try to get it up and running on our local machine.

Let’s first focus on what you need to have up and running on your local machine:

As we want to run Azure Arc on our local machine, we also need a local Kubernetes cluster up and running. You can use Minikube, MicroK8s, KIND (Kubernetes in Docker), or any other flavor you like that can be installed locally. I tested with both Minikube and KIND.

Now we can move on to the Azure side. Let’s see what we need there:

  • An active Azure subscription
  • The Azure CLI (az) installed

Register resource providers

Now we need to register 3 extra resource providers in our Azure subscription before we can use Azure Arc-enabled Kubernetes:

az provider register --namespace Microsoft.Kubernetes

az provider register --namespace Microsoft.KubernetesConfiguration

az provider register --namespace Microsoft.ExtendedLocation

Remark: Registration is an asynchronous process, and may take approximately 10 minutes.

You can monitor the registration process with the following commands:

az provider show -n Microsoft.Kubernetes -o table

az provider show -n Microsoft.KubernetesConfiguration -o table

az provider show -n Microsoft.ExtendedLocation -o table

You can also enable this through the Azure Portal:

  • Go to Subscriptions. Select the correct subscription.
  • Click on Resource providers. Enter the name of the resource provider in the filter box. Select the Resource Provider and click on Register.

Install CLI extensions

We also need some extensions installed for the Azure CLI:

az extension add --upgrade --yes --name connectedk8s

az extension add --upgrade --yes --name k8s-extension

az extension add --upgrade --yes --name customlocation

Remark: Make sure you have the latest version of both the CLI and the extensions installed.

Thursday, November 25, 2021

Running Azure on your laptop–Part 2 - Azure Arc for Developers

In the previous post in this series I talked about Azure Arc and its multiple flavors. One Azure-managed control plane for all your resources, no matter whether they run on-premises, on Azure or at another cloud provider, sounds great if you are an IT ops guy (or girl), but why should you care as a developer?

It is important to understand that the Azure Arc story has 2 dimensions.

1. Arc enabled infrastructure

The first dimension is the Arc enabled infrastructure. This is the part that I already talked about and that allows you to connect and control hybrid resources as if they were native Azure resources. This allows you to use additional Azure services like Azure Policy, Azure Monitor, and so on to govern, secure and monitor these services.

2. Arc enabled services

The second dimension is Arc enabled services. Once you have an Arc enabled infrastructure, you can start to deploy and run Azure services outside Azure while still operating them from Azure. This allows you to run Azure services like App Services, Functions, Logic Apps, API Management and Event Grid outside Azure!

How does this work?

Azure Application Services with Azure Arc is available as an extension on Azure Arc-enabled Kubernetes clusters that allows you to easily install the App Service control plane on Kubernetes and start deploying your applications to it. So in the next posts, we’ll focus on how to Arc-enable your Kubernetes cluster and get the Azure Application Services cluster extension up and running.

Wednesday, November 24, 2021

Running Azure on your laptop– Part 1–What is Azure Arc?

Before I dive into the details on how to get Azure Arc up and running on your laptop, it would be a good idea to start with a short introduction.

Therefore we first have to dive into how Azure works. The heart of the Azure ecosystem is the Azure control plane. This control plane manages all the resources you can find in Azure. It helps you to inventory, organize and govern all resources, and multiple tools exist that help you interact with it (think ARM templates, Bicep, Terraform, Pulumi, …).

You probably know this control plane better as the Azure Resource Manager. It controls and manages all the Azure resources, which can be as big as a Kubernetes cluster or as small as a static IP address. These resources run inside an Azure region, one of the datacenters that Microsoft has all around the world.

So where does Azure Arc fit into this picture?

If we bring Azure Arc into the picture, we can bring resources that are not running on Azure under the Azure control plane, and by doing that we can start using all the services on top of the Azure Resource Manager to secure, monitor, protect, … these resources. These resources can be running in your own datacenter or even at one of the other cloud providers!

Today, Azure Arc allows you to manage the following resource types hosted outside of Azure:

  • Servers - both physical and virtual machines running Windows or Linux.
  • Kubernetes clusters - supporting multiple Kubernetes distributions.
  • Azure data services - Azure SQL Managed Instance and PostgreSQL Hyperscale services.
  • SQL Server - enroll instances from any location with SQL Server on Azure Arc-enabled servers.

I think this really is a game changer, as it allows you to use all the knowledge, experience and tooling you’ve built up on Azure outside the Microsoft datacenters.

The future looks bright…

Tuesday, November 23, 2021

Running Azure on your laptop–Introduction

As mentioned yesterday, I promised to write a series of follow-up posts about my ‘Running Azure on your laptop’ session. I’ll use this post as a placeholder to point to the different parts.

Microsoft is more and more embracing a hybrid cloud approach. As part of this evolution, an increasing number of ‘Azure only’ services become available outside Azure. This idea is not new; people who have worked long enough in the Microsoft ecosystem may remember Azure Pack, which was a way to install Azure software on your own hardware. It gave you the Azure portal and some of its services. I never tried it myself and I don’t know any customer who used it in the wild.

A couple of years later, Microsoft announced the Azure Pack’s successor, Azure Stack. This was a hardware appliance, that you could install in your own datacenter. Over time, the name evolved to Azure Stack Portfolio as multiple flavors of Azure Stack became available. Azure Stack is still available today and keeps evolving.

At Ignite 2019, another solution was introduced: Azure Arc. With Azure Arc, Microsoft’s hybrid story continues by allowing you to manage resources from within Azure while they can be running practically anywhere.

Azure Arc supports managing and operating virtual machines, SQL Servers and Kubernetes clusters that run on any cloud provider, in a hybrid scenario, or on on-premises infrastructure, all fully managed through Microsoft Azure.

If you want to learn more about Microsoft’s hybrid cloud story, check https://azure.com/hybrid.

I’ll write multiple posts but I’ll make sure to update this post with the full list.

Monday, November 22, 2021

VisugXL - Running Azure on your laptop using Azure Arc

Last weekend I gave a presentation at VisugXL about Azure Arc. I’ll write a few follow-up posts explaining the steps I took to get it all up and running (and where I got into trouble).

If you can’t wait until then, here is already the presentation:

Friday, November 19, 2021

.NET 6–Breaking changes

Although Microsoft puts a lot of effort into maximizing backwards compatibility, migrating to .NET 6 can result in breaking changes that might affect you.

So before you start to upgrade have a look at the list of breaking changes maintained here: https://docs.microsoft.com/en-us/dotnet/core/compatibility/6.0

Tuesday, November 16, 2021

Azure DevOps–Run a tool installed as npm package

As part of our build pipeline we wanted to use GraphQL Inspector to check if our updated GraphQL schema contained any breaking changes.

GraphQL Inspector is available as a command-line tool and can be installed through NPM.

So the first step was to use the NPM task to install the tool:
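A sketch of the task in YAML (the package name follows the GraphQL Inspector documentation):

- task: Npm@1
  displayName: 'Install GraphQL Inspector'
  inputs:
    command: 'custom'
    customCommand: 'install @graphql-inspector/cli graphql'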

But now the question is: how can we invoke the installed tool?

This is possible thanks to NPX. NPX stands for Node Package Execute and it comes with NPM. It is an npm package runner that can execute any package that you want from the npm registry.

I added a command line task to invoke npx:
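A sketch (the schema file names are placeholders):

- script: npx graphql-inspector diff schema.old.graphql schema.graphql
  displayName: 'Detect breaking GraphQL schema changes'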

Remark: When using NPX you don’t even need to install the package first. This means that the NPM task I created first is not strictly necessary.

Monday, November 15, 2021

ASP.NET Core - Build a query string

What is wrong with the following code?
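(The snippet isn’t shown in this post, but it boiled down to something like this; the URL and variable names are illustrative.)

var url = $"https://api.example.be/search?searchterm={searchterm}";
var response = await client.GetAsync(url);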

Nothing, you would say? What if I passed ‘Bert & Ernie’ as the searchterm parameter?

The problem is that I’m using string interpolation to build up the query string. This could be OK if you have full control over the passed parameters, but in this case it is input coming from a user. The example above would lead to an incorrect query string.

Writing the correct logic to handle ampersands, question marks and so on yourself would be a challenge. Luckily ASP.NET Core offers a QueryHelpers class with an AddQueryString function:

public static string AddQueryString(string uri, string name, string value);

public static string AddQueryString(string uri, IDictionary<string, string> queryString);

Let’s update our code example to use this:
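A sketch, reusing the illustrative names from above:

using Microsoft.AspNetCore.WebUtilities;

var url = QueryHelpers.AddQueryString("https://api.example.be/search", "searchterm", searchterm);
// Produces: https://api.example.be/search?searchterm=Bert%20%26%20Ernie
var response = await client.GetAsync(url);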

That's better!

Wednesday, November 10, 2021

Azure DevOps Pipelines–A local file header is corrupt

A colleague contacted me with the following question: he tried to run a specific Azure Pipelines build, but the build failed before even one build task could be executed.

I asked him to send me the logs and he shared the following screenshot:

As you can see, the build fails in the ‘Job initialization’ phase while downloading a specific task, ‘VersionAssemblies’.

The strange thing was that when I searched for this build task, I couldn’t find it in the list of installed extensions on the Azure DevOps server.

I took a look at the Azure DevOps marketplace and even there this specific build task was non-existent. Strange!

At least this explained the error message I got, as the build pipeline probably couldn’t find the task either.

In the end I fixed it by introducing an alternative build task that achieved the same goal (updating the AssemblyInfo with a build number).

Tuesday, November 9, 2021

Azure DevOps–SonarQube error

As part of our build pipelines, we run a code analysis through SonarQube. After moving SonarQube to a different server, our Azure DevOps pipelines started to fail.

When I opened the build logs, I noticed the following error message:

ERROR: JAVA_HOME exists but does not point to a valid Java home folder. No “bin\java.exe” file can be found there.

I logged in on our SonarQube server and checked the value of the JAVA_HOME environment variable:

JAVA_HOME = c:\program files\Zulu\zulu-11\bin\

Based on the error message above, it seems that the SonarScanner expects that we don’t include the ‘bin’ folder. So I updated the environment variable to:

JAVA_HOME = c:\program files\Zulu\zulu-11

After rescheduling the build, the SonarQube analysis task completed successfully.

Monday, November 8, 2021

GraphQL Crash Course

If you want to get started with GraphQL, you can have a look at the following video:

Remark: This is part of a bigger course that is available on Udemy.

Friday, October 29, 2021

GraphQL–Strawberry Shake GraphQL client

Until recently I always used GraphQL.Client as the GraphQL client of my choice. This client is straightforward and easy to use.

For a new project I decided to give Strawberry Shake a try. Strawberry Shake was created by ChilliCream, the creators of the HotChocolate GraphQL backend for .NET.

Strawberry Shake uses a different approach than GraphQL.Client, as it relies heavily on code generation and looks similar to the Apollo GraphQL client from a design point of view.

I mostly followed the “Get started” documentation to get the Strawberry Shake client up and running, but I didn’t get everything working immediately, so I’ll add some extra detail on the points where I got into trouble.

Add the CLI tools

  • We start by adding the Strawberry Shake CLI tools
  • Open the folder that contains the project where you want to add the Strawberry Shake GraphQL client.
  • Now we first need to create a dotnet tool manifest.

dotnet new tool-manifest

Getting ready...

The template "Dotnet local tool manifest file" was created successfully.

  • After doing that we can install the Strawberry Shake tools locally.

dotnet tool install StrawberryShake.Tools --local

You can invoke the tool from this directory using the following commands: 'dotnet tool run dotnet-graphql' or 'dotnet dotnet-graphql'.

Tool 'strawberryshake.tools' (version '12.0.1') was successfully installed. Entry is added to the manifest file C:\projects\graphqlclientexample\.config\dotnet-tools.json.

Add the NuGet packages

  • Now we need to add some NuGet packages:

dotnet add package StrawberryShake.Transport.Http

dotnet add package StrawberryShake.CodeGeneration.CSharp.Analyzers

dotnet add package Microsoft.Extensions.DependencyInjection

dotnet add package Microsoft.Extensions.Http

Add the GraphQL client

  • The next step is to add a client using the following command: dotnet graphql init {{ServerUrl}} -n {{ClientName}}.

dotnet graphql init https://example.graphql.be/graphql/ -n ExampleClient

Download schema started.

Download schema completed in 399 ms

Client configuration started.

Client configuration completed in 137 ms

  •  A .graphqlrc.json is generated together with a schema.graphql file and a schema.extensions.graphql file:

Add a GraphQL query

  • At this moment no code is generated yet. Therefore we have to write our first GraphQL query.
  • Create a new .graphql file and write the query you want to execute.
  • The next step mentioned in the documentation is that you only need to compile your code to let the code generation do its work. But nothing happened when I tried to do that.
  • I discovered that I had to take one extra step: I needed to set the build action for the .graphql file to GraphQL compiler.

  • Now when I build my code, a Generated folder appears containing an ExampleClient.StrawberryShake.cs file.
  • Register the generated client in your Startup.cs file:
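Strawberry Shake generates an extension method based on the client name; registering it could look like this (a sketch, with a placeholder URL):

services
    .AddExampleClient()
    .ConfigureHttpClient(client =>
        client.BaseAddress = new Uri("https://example.graphql.be/graphql/"));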

Use the generated query

  • To use the query, I first need to inject the generated client.
  • Then I can invoke the query in a type-safe manner, as shown in the sketch below:
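A sketch of both steps together (the GetPersons query and the result shape are hypothetical; the IExampleClient interface is generated by Strawberry Shake):

using System.Threading.Tasks;

public class PersonsViewModel
{
    private readonly IExampleClient _client;

    public PersonsViewModel(IExampleClient client)
    {
        _client = client; // injected by the DI container
    }

    public async Task LoadAsync()
    {
        // Invoke the generated query in a type-safe manner
        var result = await _client.GetPersons.ExecuteAsync();
        result.EnsureNoErrors();
        // result.Data now exposes strongly typed properties matching the query
    }
}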

Thursday, October 28, 2021

Azure Pipelines - Unable to determine the location of vstest.console.exe

A colleague forwarded me a question about a failing build pipeline. When I took a look at the build results, I noticed that the Visual Studio Test task was failing.

Inside the logs I found more details explaining what was going on:

##[warning]No results found to publish.

##[debug]Processed: ##vso[task.logissue type=warning]No results found to publish.

##[error]System.Management.Automation.CmdletInvocationException: Unable to determine the location of vstest.console.exe ---> System.IO.FileNotFoundException: Unable to determine the location of vstest.console.exe

##[debug]Processed: ##vso[task.logissue type=error;]System.Management.Automation.CmdletInvocationException: Unable to determine the location of vstest.console.exe ---> System.IO.FileNotFoundException: Unable to determine the location of vstest.console.exe

   at Microsoft.TeamFoundation.DistributedTask.Task.Internal.InvokeVSTestCmdlet.GetVsTestLocation()

   at Microsoft.TeamFoundation.DistributedTask.Task.Internal.InvokeVSTestCmdlet.ProcessRecord()

   at System.Management.Automation.CommandProcessor.ProcessRecord()

   --- End of inner exception stack trace ---

   at System.Management.Automation.Runspaces.PipelineBase.Invoke(IEnumerable input)

   at System.Management.Automation.PowerShell.Worker.ConstructPipelineAndDoWork(Runspace rs, Boolean performSyncInvoke)

   at System.Management.Automation.PowerShell.Worker.CreateRunspaceIfNeededAndDoWork(Runspace rsToUse, Boolean isSync)

   at System.Management.Automation.PowerShell.CoreInvokeHelper[TInput,TOutput](PSDataCollection`1 input, PSDataCollection`1 output, PSInvocationSettings settings)

   at System.Management.Automation.PowerShell.CoreInvoke[TInput,TOutput](PSDataCollection`1 input, PSDataCollection`1 output, PSInvocationSettings settings)

   at Microsoft.TeamFoundation.DistributedTask.Handlers.LegacyVSTSPowerShellHost.VSTSPowerShellHost.Main(String[] args)

##[error]LegacyVSTSPowerShellHost.exe completed with return code: -1.

On the build server only the latest Visual Studio 2019 Build Tools were installed, and I noticed that the pipeline was still using an older version of the Visual Studio Test task. This older version could only handle older Visual Studio versions.

I changed it to the latest version:

Now I could select Visual Studio 2019 (or latest) as the Test platform version:

I triggered the build again and this time it succeeded.

Wednesday, October 27, 2021

.NET Tools - Cannot find a manifest file

A .NET tool is a special NuGet package that contains a console application. You can install a .NET tool as a global tool (using the --global argument) or as a local tool (using the --local argument).

However when I tried to install a specific tool locally, it failed with the following error message: “Cannot find a manifest file.”

dotnet tool install StrawberryShake.Tools --local

Cannot find a manifest file.

For a list of locations searched, specify the "-d" option before the tool name.

If you intended to install a global tool, add `--global` to the command.

If you would like to create a manifest, use `dotnet new tool-manifest`, usually in the repo root directory.

To install a tool for local access only, it has to be added to a tool manifest file. As I hadn’t created such a file yet, I got the error message mentioned above.

To fix this, we first need to create a tool manifest file by running the dotnet new tool-manifest command:

dotnet new tool-manifest

Getting ready...

The template "Dotnet local tool manifest file" was created successfully.

This command creates a manifest file named dotnet-tools.json under the .config directory.

Now we can retry the install command and this time it will succeed:

dotnet tool install StrawberryShake.Tools --local

You can invoke the tool from this directory using the following commands: 'dotnet tool run dotnet-graphql' or 'dotnet dotnet-graphql'.

Tool 'strawberryshake.tools' (version '12.0.1') was successfully installed. Entry is added to the manifest file C:\projects\IAMCore\.config\dotnet-tools.json.

If we take a look at the manifest file, we can see the following info:
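{
  "version": 1,
  "isRoot": true,
  "tools": {
    "strawberryshake.tools": {
      "version": "12.0.1",
      "commands": [
        "dotnet-graphql"
      ]
    }
  }
}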

More information: https://docs.microsoft.com/en-us/dotnet/core/tools/global-tools

Tuesday, October 26, 2021

vscode.dev : Bringing VS Code to the browser

Although I’m still using Visual Studio (or Rider, depending on the mood) for my day-to-day C# development (and F# occasionally), I use Visual Studio Code for all other languages and web development.

With vscode.dev, your favorite code editor becomes available everywhere, without the need to leave the browser and install anything.

Thanks to the File System Access API support in modern browsers, vscode.dev can access the local file system. This enables scenarios like local file viewing and editing.

Integration with GitHub and Azure DevOps is also available, allowing you to sync your changes with repositories on both platforms.

However, don’t expect vscode.dev to be on par with the desktop version in terms of functionality. For example, there’s no integrated debugging or terminal in vscode.dev.

More information: https://code.visualstudio.com/blogs/2021/10/20/vscode-dev

Monday, October 25, 2021

MassTransit - Stop handling erroneous messages

By default, when a MassTransit consumer fails to handle a message (and throws an exception), the message is moved to an _error queue (prefixed by the receive endpoint queue name). This is OK for transient exceptions, but probably not what you want when you have a bug in your system or there is another reason why none of the messages can be handled successfully.

In that case, another feature of MassTransit comes in handy: the kill switch.

A Kill Switch is used to prevent failing consumers from moving all the messages from the input queue to the error queue. By monitoring message consumption and tracking message successes and failures, a Kill Switch stops the receive endpoint when a trip threshold has been reached.

You can configure a kill switch for a specific endpoint or for all receive endpoints on the bus.

Here is a short example of how to configure the kill switch for all receive endpoints:
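A sketch based on the MassTransit documentation (the RabbitMQ transport and the container wiring are assumptions):

using MassTransit;

services.AddMassTransit(x =>
{
    x.UsingRabbitMq((context, cfg) =>
    {
        cfg.UseKillSwitch(options => options
            .SetActivationThreshold(10)   // start monitoring after 10 messages
            .SetTripThreshold(0.15)       // trip when 15% of attempts fail
            .SetRestartTimeout(m: 1));    // restart the endpoint after 1 minute

        cfg.ConfigureEndpoints(context);
    });
});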


In the above example, the kill switch will activate after 10 messages have been consumed. If the ratio of failures/attempts exceeds 15%, the kill switch will trip and stop the receive endpoint. After 1 minute, the receive endpoint will be restarted. Once restarted, if exceptions are still observed, the receive endpoint will be stopped again for 1 minute.

If you want to learn more about this feature, check out this video by Chris Patterson, the creator of MassTransit:

Friday, October 22, 2021

Service decomposition and service design

Finding the boundaries of your system and decomposing it into multiple services sounds easy, but it certainly isn’t.

If you are interested in this topic, check out the blog series by Vadim Samokhin:

Remark: After writing this post, I noticed that Vadim created a blog post linking to the series above that also includes some other related posts.

Thursday, October 21, 2021

Azure AKS–Save some money using spot node pools

One of the ways you can save some money on Azure is by using spot node pools for your Azure Kubernetes Service cluster.

What’s a spot node pool?

Using a spot node pool allows you to take advantage of unused Azure capacity at significant cost savings. At any point in time when Azure needs the capacity back, the Azure infrastructure will evict spot nodes. Therefore, spot nodes are great for workloads that can handle interruptions, like batch processing jobs, dev/test environments, large compute workloads, and more.

Remark: A spot node pool can’t be the cluster’s default node pool. A spot node pool can only be used for a secondary pool.

Pricing for a spot node pool

Pricing for spot instances is variable, based on region and SKU. For more information, see pricing for Linux and Windows. You do have the option to set a max price. In case the price is exceeded, the spot node is evicted from your cluster.

Schedule a deployment to use the spot node pool

A spot node pool has the label kubernetes.azure.com/scalesetpriority:spot and the taint kubernetes.azure.com/scalesetpriority=spot:NoSchedule. We use this information to add a toleration in our deployment.yaml:
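A sketch of the relevant part of the deployment.yaml:

spec:
  template:
    spec:
      tolerations:
      - key: "kubernetes.azure.com/scalesetpriority"
        operator: "Equal"
        value: "spot"
        effect: "NoSchedule"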

In case you have multiple spot node pools, you can use a nodeSelector to select a specific pool:
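For example (the pool name is hypothetical; AKS labels each node with the name of its node pool):

spec:
  template:
    spec:
      nodeSelector:
        agentpool: examplespotpool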

More information

Winget–A package manager for Windows

I’ve been using Chocolatey for a long time as an easy way to get my Windows machine configured with all the software I need. With the release of version 1.1 of the Windows Package Manager (WinGet), I thought it was a good time to give it a try.

Installation

Chances are high that WinGet is already available on your machine. Open a terminal and type winget. If it is available, you should see something like this:

If not, the Windows Package Manager is distributed with the App Installer from the Microsoft Store. You can also download and install the Windows Package Manager from GitHub, or just directly install the latest available released version.

Searching a package

The list of available packages is quite large (more than 2,600 packages in the Windows Package Manager app repository). Just run winget search <SomePackage> to see if the package you are looking for is available there.

For example, let’s search for my favorite git client GitKraken:

PS C:\Users\bawu> winget search gitkraken
Name      Id                Version Source
-------------------------------------------
GitKraken Axosoft.GitKraken 8.1.0   winget

For packages inside the Microsoft Store you don’t get a readable id but a hash value instead:

PS C:\Users\bawu> winget search git
Name                                  Id                                         Version                    Source
-------------------------------------------------------------------------------------------------------------------
Learn Pro GIT                         9NHM1C45G44B                               Unknown                    msstore
My Git                                9NLVK2SL2SSP                               Unknown                    msstore
GitCup                                9NBLGGH4XFHP                               Unknown                    msstore
GitVine                               9P3BLC2GW78W                               Unknown                    msstore
GitFiend                              9NMNKLTSZNKC                               Unknown                    msstore
GitIt                                 9NBLGGH40HV7                               Unknown                    msstore
GitHub Zen                            9NBLGGH4RTK3                               Unknown                    msstore
GitLooker                             9PK6TGX9T87P                               Unknown                    msstore
Bhagavad Gita                         9WZDNCRFJCV5                               Unknown                    msstore
Git                                   Git.Git                                    2.33.1                     winget
GitNote                               zhaopengme.gitnote                         3.1.0         Tag: git     winget
Agent Git                             Xidicone.AgentGit                          1.85          Tag: Git     winget
TortoiseSVN                           TortoiseSVN.TortoiseSVN                    1.14.29085    Tag: git     winget
TortoiseGit                           TortoiseGit.TortoiseGit                    2.12.0.0      Tag: git     winget

Installing a package

After you have found the package you want, installing it is as easy as invoking the following command:

winget install --id <SomePackage>

Of course the real fun starts when you create a script that contains all the packages you need for your day-to-day work. Here is the script I’m using:
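(The script itself isn’t included here; below is a minimal sketch of the idea. The package IDs are just examples.)

$packages = @(
    "Git.Git",
    "Axosoft.GitKraken",
    "Microsoft.VisualStudioCode"
)

foreach ($package in $packages) {
    winget install --id $package -e
}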