Wednesday, September 30, 2020

.NET Core Plugin architecture

A plugin architecture remains a valuable option to make your application extensible. With the introduction of AssemblyLoadContext and AssemblyDependencyResolver in .NET Core 3, creating and loading plugins became a lot easier.

Still, if you need more features, I would recommend having a look at the DotNetCorePlugins project: https://github.com/natemcmaster/DotNetCorePlugins

Usage is simple through the PluginLoader class:
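Here is a minimal sketch of how that looks, based on the project's README. The IPlugin interface and the plugin path are illustrative, not part of the library:

using System;
using McMaster.NETCore.Plugins;

public class Program
{
    public static void Main()
    {
        // Load the plugin assembly with its own dependencies,
        // while sharing the IPlugin contract with the host.
        var loader = PluginLoader.CreateFromAssemblyFile(
            @".\plugins\MyPlugin\MyPlugin.dll",
            sharedTypes: new[] { typeof(IPlugin) });

        var assembly = loader.LoadDefaultAssembly();

        foreach (var pluginType in assembly.GetTypes())
        {
            if (typeof(IPlugin).IsAssignableFrom(pluginType) && !pluginType.IsAbstract)
            {
                var plugin = (IPlugin)Activator.CreateInstance(pluginType);
                plugin.Execute();
            }
        }
    }
}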

One nice feature it adds to the mix is hot reloading. This will allow you to dynamically update assemblies on the fly:
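A sketch of how to enable it; the EnableHotReload flag and the Reloaded event are taken from the project's README, so treat the details as an assumption:

var loader = PluginLoader.CreateFromAssemblyFile(
    @".\plugins\MyPlugin\MyPlugin.dll",
    sharedTypes: new[] { typeof(IPlugin) },
    config => config.EnableHotReload = true); // watch the plugin file and reload on change

loader.Reloaded += (sender, eventArgs) =>
{
    Console.WriteLine("Plugin reloaded");
};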

Tuesday, September 29, 2020

Azure DevOps–Dependency Tracker

When working with multiple teams in Azure DevOps that have cross team dependencies, it can become quite hard to track what is going on.

If you recognize this problem I would recommend having a look at the Dependency Tracker extension for Azure DevOps: https://marketplace.visualstudio.com/items?itemName=ms-eswm.dependencytracker&ssr=false#overview

The Dependency Tracker extension provides you with the ability to plan and manage dependencies across teams, projects, and even organizations. It provides filterable views to show all dependencies a team is consuming and producing.

You can see both producing and consuming dependencies:

And there is even a timeline view available:

The easiest way to manage your dependencies is to use the predecessor/successor link type to link work items together:

Create links manually

More information: https://docs.microsoft.com/en-us/azure/devops/boards/extensions/dependency-tracker

Monday, September 28, 2020

Angular 10–Strict mode

With the release of Angular 10 a more strict project setup is available. You can enable this by adding a --strict flag when creating a new Angular project:

ng new --strict

This strict mode enables more build-time optimizations, helps you catch bugs faster, and improves the overall maintainability of your Angular application.

To achieve this, it applies the following changes:

  • Enables strict mode in TypeScript
  • Turns template type checking to Strict
  • Default bundle budgets have been reduced by ~75%
  • Configures linting rules to prevent declarations of type any
  • Configures your app as side-effect free to enable more advanced tree-shaking

There is no out-of-the-box option to enable this for an existing project (after creating it without the --strict flag or after upgrading from a previous version), but you can apply the same changes manually:

  • Add the following rules to your tsconfig.json:
"compilerOptions": {
  "strict": true,
  "forceConsistentCasingInFileNames": true,
  "noFallthroughCasesInSwitch": true
},
"angularCompilerOptions": {
  "strictInjectionParameters": true,
"strictInputAccessModifiers": true, "strictTemplates": true }
  • Update the tslint.json:
"no-any": true
  • Update the bundle budget sizes in your angular.json:
"configurations": {
  "production": {
    "budgets": [
      {
        "type": "initial",
        "maximumWarning": "500kb",
        "maximumError": "1mb",
      },
      {
        "type": "anyComponentStyle",
        "maximumWarning": "2kb",
        "maximumError": "4kb",
      },
    ]
  }
}
  • Add a schematics section under projects.[projectName].schematics in the angular.json:

"schematics": {
  "@schematics/angular:application": {
    "strict": true
  }
}
More information: https://blog.angular.io/angular-cli-strict-mode-c94ba5965f63



Friday, September 25, 2020

Computer stuff they didn’t teach you…

It is always fun to see Scott Hanselman in action. And in this video series he explains a lot of IT concepts in a simple and concise way.

A good way to end your work week…

Thursday, September 24, 2020

GraphQL–The query has non-mergable fields

I constructed the following query in GraphQL playground:
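It looked roughly like this (the product field and its arguments come from my schema, so the exact names are illustrative):

query {
  product(id: 1) {
    id
    name
  }
  product(name: "Keyboard") {
    id
    name
  }
}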

In this query I try to combine two calls: one to fetch a product by id and another one to fetch a product by name.

However when I tried to execute this query it resulted in the following error message:

The query has non-mergable fields

The problem is that GraphQL tries to merge the results into one product result object, which does not work.

To get the behavior I want, I need to specify an alias for every query:
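The same illustrative query, now with an alias in front of each call:

query {
  byId: product(id: 1) {
    id
    name
  }
  byName: product(name: "Keyboard") {
    id
    name
  }
}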

Now when I execute the query, the query results are put in their own object:

Wednesday, September 23, 2020

Error when running Sonar Cloud on Angular projects

And the story continues…

After setting up Sonar Cloud integration in Azure DevOps and applying a fix for .NET Core applications, I tried to do the same thing for our Angular code.

It didn’t work as expected; this is the output I got:

More Information: https://sonarcloud.io/documentation/analysis/scan/sonarscanner-for-azure-devops/

==============================================================================

D:\a\_tasks\SonarCloudPrepare_14d9cde6-c1da-4d55-aa01-2965cd301255\1.12.0\classic-sonar-scanner-msbuild\SonarScanner.MSBuild.exe end

SonarScanner for MSBuild 4.10

Using the .NET Framework version of the Scanner for MSBuild

Post-processing started.

18:04:47.123 Fetching code coverage report information from TFS...

18:04:47.125 Attempting to locate a test results (.trx) file...

18:04:47.64 Looking for TRX files in: D:\a\1\TestResults

18:04:47.64 No test results files found

18:04:48.125 Did not find any binary coverage files in the expected location.

18:04:48.127 Falling back on locating coverage files in the agent temp directory.

18:04:48.128 Searching for coverage files in D:\a\_temp

18:04:48.128 No coverage files found in the agent temp directory.

##[error]The SonarQube MSBuild integration failed: SonarQube was unable to collect the required information about your projects.

Possible causes:

1. The project has not been built - the project must be built in between the begin and end steps

2. An unsupported version of MSBuild has been used to build the project. Currently MSBuild 14.0.25420.1 and higher are supported.

3. The begin, build and end steps have not all been launched from the same folder

4. None of the analyzed projects have a valid ProjectGuid and you have not used a solution (.sln)

The SonarQube MSBuild integration failed: SonarQube was unable to collect the required information about your projects.

Possible causes:

1. The project has not been built - the project must be built in between the begin and end steps

2. An unsupported version of MSBuild has been used to build the project. Currently MSBuild 14.0.25420.1 and higher are supported.

3. The begin, build and end steps have not all been launched from the same folder

4. None of the analyzed projects have a valid ProjectGuid and you have not used a solution (.sln)

Generation of the sonar-properties file failed. Unable to complete SonarQube analysis.

##[error]18:04:48.176 Post-processing failed. Exit code: 1

18:04:48.176 Post-processing failed. Exit code: 1

##[error]The process 'D:\a\_tasks\SonarCloudPrepare_14d9cde6-c1da-4d55-aa01-2965cd301255\1.12.0\classic-sonar-scanner-msbuild\SonarScanner.MSBuild.exe' failed with exit code 1

Finishing: Run Code Analysis

Did you notice my mistake? As I’m building an Angular application, I shouldn’t be using MSBuild. Instead I need to use the standalone scanner.

Let’s fix this in the Prepare Analysis task:
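In YAML this boils down to switching the scanner mode to the standalone CLI scanner. A sketch, with placeholder values for the service connection, organization and project:

- task: SonarCloudPrepare@1
  inputs:
    SonarCloud: 'SonarCloud Service Connection'  # placeholder
    organization: 'my-organization'              # placeholder
    scannerMode: 'CLI'                           # standalone scanner instead of MSBuild
    configMode: 'manual'
    cliProjectKey: 'my-angular-project'
    cliProjectName: 'My Angular Project'
    cliSources: '.'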

Tuesday, September 22, 2020

Error when running Sonar Cloud on .NET Core projects

Yesterday I talked about setting up Sonar Cloud for code analysis.

Unfortunately it turned out to be not as simple as I expected. The first time I ran the build pipeline the Code analysis task failed with the following error:

2020-09-14T17:47:52.6441174Z ##[section]Starting: Run Code Analysis

2020-09-14T17:47:52.6563714Z ==============================================================================

2020-09-14T17:47:52.6564026Z Task         : Run Code Analysis

2020-09-14T17:47:52.6564329Z Description  : Run scanner and upload the results to the SonarCloud server.

2020-09-14T17:47:52.6564583Z Version      : 1.15.0

2020-09-14T17:47:52.6564794Z Author       : sonarsource

2020-09-14T17:47:52.6565319Z Help         : Version: 1.15.0. This task is not needed for Maven and Gradle projects since the scanner should be run as part of the build.

More Information: https://sonarcloud.io/documentation/analysis/scan/sonarscanner-for-azure-devops/

2020-09-14T17:47:52.6565909Z ==============================================================================

2020-09-14T17:47:52.9698863Z [command]D:\a\_tasks\SonarCloudPrepare_14d9cde6-c1da-4d55-aa01-2965cd301255\1.12.0\classic-sonar-scanner-msbuild\SonarScanner.MSBuild.exe end

2020-09-14T17:47:53.0469811Z SonarScanner for MSBuild 4.10

2020-09-14T17:47:53.0470290Z Using the .NET Framework version of the Scanner for MSBuild

2020-09-14T17:47:53.0984270Z Post-processing started.

2020-09-14T17:47:54.3358218Z WARNING: Duplicate ProjectGuid: "00000000-0000-0000-0000-000000000000". The project will not be analyzed by SonarQube. Project file: "d:\a\1\s\bridges.cms.modules.contentcreation.shared\bridges.cms.modules.contentcreation.shared.csproj"

2020-09-14T17:47:54.3359968Z WARNING: Duplicate ProjectGuid: "00000000-0000-0000-0000-000000000000". The project will not be analyzed by SonarQube. Project file: "d:\a\1\s\bridges.cms.modules.contentcreation.infrastructure\bridges.cms.modules.contentcreation.infrastructure.csproj"

2020-09-14T17:47:54.3361069Z WARNING: Duplicate ProjectGuid: "00000000-0000-0000-0000-000000000000". The project will not be analyzed by SonarQube. Project file: "d:\a\1\s\bridges.cms.modules.contentcreation.domain\bridges.cms.modules.contentcreation.domain.csproj"

2020-09-14T17:47:54.3362264Z WARNING: Duplicate ProjectGuid: "00000000-0000-0000-0000-000000000000". The project will not be analyzed by SonarQube. Project file: "d:\a\1\s\bridges.cms.modules.contentcreation.application\bridges.cms.modules.contentcreation.application.csproj"

2020-09-14T17:47:54.3364005Z WARNING: Duplicate ProjectGuid: "00000000-0000-0000-0000-000000000000". The project will not be analyzed by SonarQube. Project file: "d:\a\1\s\bridges.cms.modules.contentcreation.dataaccess\bridges.cms.modules.contentcreation.dataaccess.csproj"

2020-09-14T17:47:54.3365002Z WARNING: Duplicate ProjectGuid: "00000000-0000-0000-0000-000000000000". The project will not be analyzed by SonarQube. Project file: "d:\a\1\s\bridges.cms.modules.contentcreation.archtests\bridges.cms.modules.contentcreation.archtests.csproj"

2020-09-14T17:47:54.3366234Z WARNING: Duplicate ProjectGuid: "00000000-0000-0000-0000-000000000000". The project will not be analyzed by SonarQube. Project file: "d:\a\1\s\bridges.cms.modules.contentcreation.integrationtests\bridges.cms.modules.contentcreation.integrationtests.csproj"

2020-09-14T17:47:54.3367348Z WARNING: Duplicate ProjectGuid: "00000000-0000-0000-0000-000000000000". The project will not be analyzed by SonarQube. Project file: "d:\a\1\s\bridges.cms.api\bridges.cms.api.csproj"

2020-09-14T17:47:54.3368778Z WARNING: Duplicate ProjectGuid: "00000000-0000-0000-0000-000000000000". The project will not be analyzed by SonarQube. Project file: "d:\a\1\s\bridges.cms.modules.contentcreation.unittests\bridges.cms.modules.contentcreation.unittests.csproj"

2020-09-14T17:47:54.3370405Z ##[error]No analysable projects were found. SonarQube analysis will not be performed. Check the build summary report for details.

2020-09-14T17:47:54.3371973Z No analysable projects were found. SonarQube analysis will not be performed. Check the build summary report for details.

2020-09-14T17:47:54.3431567Z Generation of the sonar-properties file failed. Unable to complete SonarQube analysis.

2020-09-14T17:47:54.3479798Z ##[error]17:47:54.346  Post-processing failed. Exit code: 1

2020-09-14T17:47:54.3481669Z 17:47:54.346  Post-processing failed. Exit code: 1

2020-09-14T17:47:54.3668330Z ##[error]The process 'D:\a\_tasks\SonarCloudPrepare_14d9cde6-c1da-4d55-aa01-2965cd301255\1.12.0\classic-sonar-scanner-msbuild\SonarScanner.MSBuild.exe' failed with exit code 1

2020-09-14T17:47:54.3739014Z ##[section]Finishing: Run Code Analysis

The problem is that SonarCloud expects a ProjectGuid to be able to differentiate between the different C# projects (.csproj files) I have in my source repository. As these are .NET Core projects, such a ProjectGuid no longer exists.

To solve this problem I changed my dotnet build task to use the solution file instead of using csproj files:
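In YAML the change looks something like this (a sketch; the important part is pointing projects at the solution instead of the individual .csproj files):

- task: DotNetCoreCLI@2
  displayName: 'dotnet build'
  inputs:
    command: 'build'
    projects: '**/*.sln'  # was '**/*.csproj'
    arguments: '--configuration Release'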

Monday, September 21, 2020

Running code analysis through Sonar Cloud in Azure DevOps

Sonar Cloud is the SaaS version of SonarQube, a static code analyzer. It can inspect your code against a set of quality standards, detect bugs and security vulnerabilities, calculate technical debt, and show how your code quality evolves over time.

If you want to use it in Azure DevOps you should first install the SonarCloud extension from the marketplace: https://marketplace.visualstudio.com/items?itemName=SonarSource.sonarcloud

After the extension is installed you get 3 new build tasks:

Remark: Notice that these are not the same build tasks as the ones you should use with SonarQube(!)

Let’s create a build pipeline that uses these tasks:

  • First add the Prepare analysis on SonarCloud task. This task should be added to the beginning of your pipeline. In this task you should configure the SonarCloud Service Endpoint, specify an Organization and set a Project Key and Project Name.
    • This information will be used to create a new project inside SonarCloud.
    • A last important step inside this task is to select the way analysis should be done. As we are building a .NET Core application, we can set this to ‘Integrate with MSBuild’:

  • Next we should add the Run Code Analysis task. This task should be executed after our code has been built and all tests have run.
  • At the end of the pipeline we can add the Publish Quality Gate Results step to upload the data to SonarCloud.

Our full pipeline now looks like this:
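As a rough YAML sketch (task versions and the build/test steps are illustrative):

- task: SonarCloudPrepare@1
  inputs:
    SonarCloud: 'SonarCloud Service Connection'  # placeholder
    organization: 'my-organization'              # placeholder
    scannerMode: 'MSBuild'
    projectKey: 'my-project-key'
    projectName: 'My Project'

- task: DotNetCoreCLI@2
  inputs:
    command: 'build'
    projects: '**/*.sln'

- task: DotNetCoreCLI@2
  inputs:
    command: 'test'
    projects: '**/*Tests.csproj'

- task: SonarCloudAnalyze@1

- task: SonarCloudPublish@1
  inputs:
    pollingTimeoutSec: '300'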

A last tip: I would recommend not configuring this on every CI build as it makes your build time a lot longer.

Friday, September 18, 2020

Building a producer/consumer pipeline in .NET Core using Open.ChannelExtensions

One of the lesser-known features of .NET Core is System.Threading.Channels. It allows you to implement a pipeline of producers/consumers without having to worry about locking, concurrency and so on.

For an introduction have a look here: https://devblogs.microsoft.com/dotnet/an-introduction-to-system-threading-channels/

Although it would be a good solution for a lot of use cases, I don’t see it used that often. I think the main reason is that the API is not that intuitive and it takes some time to figure out how to use it.

Let’s have a look at an example (I borrowed it from Sacha Barber’s great introduction to System.Threading.Channels):
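As the original snippet was embedded as an image, here is a comparable producer/consumer written directly against System.Threading.Channels (a minimal reconstruction, not the exact code from that article):

using System;
using System.Threading.Channels;
using System.Threading.Tasks;

public class Program
{
    public static async Task Main()
    {
        var channel = Channel.CreateBounded<int>(10);

        // Producer: writes values into the channel and completes it when done.
        var producer = Task.Run(async () =>
        {
            for (var i = 0; i < 100; i++)
            {
                await channel.Writer.WriteAsync(i);
            }
            channel.Writer.Complete();
        });

        // Consumer: reads until the channel is completed and drained.
        var consumer = Task.Run(async () =>
        {
            await foreach (var item in channel.Reader.ReadAllAsync())
            {
                Console.WriteLine(item);
            }
        });

        await Task.WhenAll(producer, consumer);
    }
}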

Although this example is rather trivial, it takes some time to wrap your head around it and understand what is going on. Let’s see if we can simplify this example through Open.ChannelExtensions. This library offers a set of extensions for optimizing/simplifying System.Threading.Channels usage.

Here is the simplified code:
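Again a reconstruction rather than the original snippet, using the Source/Pipe/ReadAll extensions shown in the library's README:

using System;
using System.Linq;
using System.Threading.Channels;
using System.Threading.Tasks;
using Open.ChannelExtensions;

public class Program
{
    public static async Task Main()
    {
        // Producer, transform and consumer expressed as one fluent pipeline;
        // buffering, completion and concurrency are handled by the library.
        await Channel
            .CreateBounded<int>(10)
            .Source(Enumerable.Range(0, 100))    // producer: writes the sequence and completes
            .Pipe(2, i => i * 2)                 // transform stage with max concurrency 2
            .ReadAll(i => Console.WriteLine(i)); // consumer
    }
}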

Thursday, September 17, 2020

MassTransit–Reading a header using middleware

After blogging about adding a header yesterday, today let’s have a look at how we can read out this header in a generic way (if you only want to do this for a specific consumer, take a look at this blog post from last week).

We’ll start by creating the filter:
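A sketch of such a filter; the 'user-name' header and what you do with the value are assumptions:

using System.Threading.Tasks;
using GreenPipes;
using MassTransit;

public class UserHeaderFilter<TConsumer> : IFilter<ConsumerConsumeContext<TConsumer>>
    where TConsumer : class
{
    public async Task Send(ConsumerConsumeContext<TConsumer> context, IPipe<ConsumerConsumeContext<TConsumer>> next)
    {
        // 'user-name' is an assumed header name; adapt it to whatever your sender sets.
        if (context.Headers.TryGetHeader("user-name", out var userName))
        {
            // hand the value over to your application code here
        }

        await next.Send(context);
    }

    public void Probe(ProbeContext context) => context.CreateFilterScope("userHeaderFilter");
}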

Now we need to find a way to register this filter in a generic way so that it is applied for every consumer. The trick is to create an IConsumerConfigurationObserver implementation. This observer will be applied for every consumer that is configured. A perfect fit for our use case:
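A minimal sketch of such an observer, attaching the filter from above to each consumer:

public class UserHeaderConfigurationObserver : IConsumerConfigurationObserver
{
    public void ConsumerConfigured<TConsumer>(IConsumerConfigurator<TConsumer> configurator)
        where TConsumer : class
    {
        // Called once for every configured consumer, so each one gets the filter.
        configurator.UseFilter(new UserHeaderFilter<TConsumer>());
    }

    public void ConsumerMessageConfigured<TConsumer, TMessage>(IConsumerMessageConfigurator<TConsumer, TMessage> configurator)
        where TConsumer : class
        where TMessage : class
    {
        // Nothing to do at the message level.
    }
}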

Of course we are not there yet. We still need a way to tell MassTransit to apply this observer. This can be done by calling the ConnectConsumerConfigurationObserver method on the IBusFactoryConfigurator:
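For example (the transport and endpoint configuration details are illustrative):

var busControl = Bus.Factory.CreateUsingRabbitMq(cfg =>
{
    cfg.ConnectConsumerConfigurationObserver(new UserHeaderConfigurationObserver());

    // ... receive endpoints and consumers are configured here ...
});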

Wednesday, September 16, 2020

MassTransit–Adding a header using middleware

MassTransit offers a lot of flexibility and it is easy to tweak it to your needs.

With all this power it is not always easy to find out the best way to solve a specific problem. I wanted to include a header with information about the user in every message.

I solved it by adding two filters: one for sending messages and one for publishing messages:
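A sketch of the two filters; IUserFactory is our own abstraction (see below) and the 'user-name' header is an assumption:

using System.Threading.Tasks;
using GreenPipes;
using MassTransit;

public class UserHeaderSendFilter : IFilter<SendContext>
{
    private readonly IUserFactory _userFactory;

    public UserHeaderSendFilter(IUserFactory userFactory) => _userFactory = userFactory;

    public Task Send(SendContext context, IPipe<SendContext> next)
    {
        // Stamp every outgoing message with the current user.
        context.Headers.Set("user-name", _userFactory.GetUserName());
        return next.Send(context);
    }

    public void Probe(ProbeContext context) => context.CreateFilterScope("userHeaderSend");
}

public class UserHeaderPublishFilter : IFilter<PublishContext>
{
    private readonly IUserFactory _userFactory;

    public UserHeaderPublishFilter(IUserFactory userFactory) => _userFactory = userFactory;

    public Task Send(PublishContext context, IPipe<PublishContext> next)
    {
        context.Headers.Set("user-name", _userFactory.GetUserName());
        return next.Send(context);
    }

    public void Probe(ProbeContext context) => context.CreateFilterScope("userHeaderPublish");
}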

Inside these filters I resolved an IUserFactory which gave me access to the underlying user. This is how I implemented this in ASP.NET Core:
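A sketch of that factory; the IUserFactory shape is hypothetical, the interesting part is the IHttpContextAccessor:

using Microsoft.AspNetCore.Http;

public interface IUserFactory
{
    string GetUserName();
}

public class HttpContextUserFactory : IUserFactory
{
    private readonly IHttpContextAccessor _httpContextAccessor;

    public HttpContextUserFactory(IHttpContextAccessor httpContextAccessor)
        => _httpContextAccessor = httpContextAccessor;

    public string GetUserName()
        => _httpContextAccessor.HttpContext?.User?.Identity?.Name;
}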

I registered both filters in the IoC container (Autofac in our case) to make sure that the dependencies are correctly resolved:
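The Autofac registrations could look like this (a sketch):

builder.RegisterType<HttpContextUserFactory>().As<IUserFactory>().InstancePerLifetimeScope();
builder.RegisterType<UserHeaderSendFilter>().AsSelf();
builder.RegisterType<UserHeaderPublishFilter>().AsSelf();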

And as a last step I configured MassTransit to use these filters:
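One way to wire them up is through the send and publish pipelines; a sketch in which resolving the filter instances from the container is elided:

Bus.Factory.CreateUsingRabbitMq(cfg =>
{
    cfg.ConfigureSend(pipe => pipe.UseFilter(sendFilter));       // UserHeaderSendFilter instance
    cfg.ConfigurePublish(pipe => pipe.UseFilter(publishFilter)); // UserHeaderPublishFilter instance
});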

Tuesday, September 15, 2020

C# 8–IAsyncDisposable

The IDisposable interface has always been an important cornerstone of correctly cleaning up resources when you no longer need them. Together with the using statement it became an easy pattern to implement and use in C#.

Unfortunately you couldn’t use IDisposable when you had async work to do to clean up your resources. This is solved with the introduction of the IAsyncDisposable interface in C# 8.

Be aware that when you implement the IAsyncDisposable interface and your class is not sealed, you should not only implement the interface but also provide a second method, DisposeAsyncCore, with the following signature:
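The signature, together with the surrounding pattern from the documentation, looks like this:

using System;
using System.Threading.Tasks;

public class ExampleResource : IAsyncDisposable
{
    public async ValueTask DisposeAsync()
    {
        await DisposeAsyncCore();
        GC.SuppressFinalize(this);
    }

    // Virtual, so derived classes can override it and chain their own cleanup.
    protected virtual ValueTask DisposeAsyncCore()
    {
        // release your async resources here
        return default;
    }
}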

This is to help guarantee that all resources are cleaned up correctly when your class is used as a base class.

Your code can then be used in an ‘await using’:
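For example:

await using (var resource = new ExampleResource())
{
    // use the resource; DisposeAsync is called when the block exits
}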

More information: Implementing a dispose async method

Monday, September 14, 2020

ASP.NET Core–Validate claims based on controller-action

For a CRUD-style application we created some base controllers that encapsulate most of the logic. For every action (create/read/update/delete) a different claim applies, and we would like to add an authorization filter to the base action methods.

To handle this scenario, we created a custom authorization filter that uses the controller name and action name to check for a specific claim:
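A sketch of what such a filter can look like; the 'permission' claim type and the Controller.Action naming convention are assumptions:

using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Mvc.Controllers;
using Microsoft.AspNetCore.Mvc.Filters;

public class ClaimsAuthorizationFilter : IAuthorizationFilter
{
    public void OnAuthorization(AuthorizationFilterContext context)
    {
        if (!(context.ActionDescriptor is ControllerActionDescriptor descriptor))
            return;

        // Build the expected claim value from the controller and action names, e.g. "Orders.CreateOrder".
        var requiredClaim = $"{descriptor.ControllerName}.{descriptor.ActionName}";

        if (!context.HttpContext.User.HasClaim(c => c.Type == "permission" && c.Value == requiredClaim))
        {
            context.Result = new ForbidResult();
        }
    }
}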

Friday, September 11, 2020

.NET Core Tools

Starting with .NET Core 2.1 Microsoft introduced the Dotnet Tools platform as a simple way to consume existing (small) .NET Core applications through NuGet. In a way it is quite similar to installing tools through NPM.

You can see the tools you already installed by executing the following command:

dotnet tool list -g

This should output the list of global tools:

C:\Users\BaWu>dotnet tool list -g
Package Id         Version          Commands
-------------------------------------------------
dotnet-ef          3.0.0            dotnet-ef
dotnet-format      3.3.111304       dotnet-format
dotnet-try         1.0.19553.4      dotnet-try

The question is: how can we easily find the tools that are out there?

Go to nuget.org, click on Packages and hit the Filter button to see a list of filter options.

One of the options you have is to filter on .NET Tool Package type. That should do the trick…
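Once you’ve found a tool, installing it is a one-liner, for example:

dotnet tool install -g dotnet-ef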

More information here: https://docs.microsoft.com/en-us/dotnet/core/tools/global-tools

Thursday, September 10, 2020

GraphQL - Faker

A great way to mock your GraphQL API (or extend an existing API) is through GraphQL Faker.

It can act as a proxy between your existing API and your app making it really easy to try out new use cases and develop your frontend and backend in parallel.

Installation

To use it, first install it through NPM:

npm install -g graphql-faker

Usage

Now you can either create a completely new GraphQL API through:
graphql-faker --open
Or extend an existing API:
graphql-faker ./ext-swapi.graphql --extend http://swapi.apis.guru
Now you can open up the editor at http://localhost:9002/editor/ and start using faker directives in your GraphQL schema. You can use the @fake directive to specify how to fake data, @listLength to specify the number of returned array items and the @examples directive to provide concrete examples:
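An illustrative schema snippet (the type and field names are made up; the directives come from the GraphQL Faker README):

type Person {
  id: ID
  firstName: String @fake(type: firstName)
  lastName: String @fake(type: lastName)
  email: String @fake(type: email)
  favoriteColor: String @examples(values: ["red", "green", "blue"])
  friends: [Person] @listLength(min: 1, max: 5)
}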

Save the model. Now you can browse to http://localhost:9002/graphql and start exploring your fake GraphQL schema.

Wednesday, September 9, 2020

Domain Driven Design–3 rules to help you get started

I recently started a new project where we are using the DDD principles to build our software. As most of my team members are new to DDD, I noticed some obvious mistakes. Here are some rules that can help you avoid the same mistakes…

Rule #1: Your domain should be modelled first

When you start a new project, spend enough time on the domain before introducing a frontend and a database. Try to identify your bounded contexts. What are your entities? What would make a good aggregate? Which domain events are needed?

This allows you to really focus on the domain itself and explore it in depth. This is also a perfect time to introduce TDD (Test-Driven Development) in your team.

Rule #2: Your domain should reflect your business logic not your database

If you start with Rule #1 you are already on the correct path. In too many cases I see that the domain is immediately set up in such a way that it works with the ORM tool the team has chosen. If the ORM needs getters and setters, they are added. If the ORM needs all properties to be public, it is changed.

Another thing I see is that people (especially if they are using a relational database) immediately start applying normalization techniques to make the domain better match the database (and decrease the impedance mismatch). Unfortunately this leads to domain models that no longer reflect the domain and are coupled to your data storage mechanism.

Rule #3: Your domain should always be valid

Your domain should always represent a valid state. This means encapsulating validation inside your domain and ensuring that nobody can change the internal state. So no setters but only methods that encapsulate the behavior of your system.
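A small illustrative example of what that looks like in code:

public enum OrderStatus { New, Paid, Shipped }

public class Order
{
    public OrderStatus Status { get; private set; } // no public setter

    public void Ship()
    {
        // The invariant lives inside the domain, not in the calling code.
        if (Status != OrderStatus.Paid)
            throw new InvalidOperationException("Only paid orders can be shipped.");

        Status = OrderStatus.Shipped;
    }
}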

This helps you avoid a lot of extra guards and if-checks spread around your code. And in the end, this is what good object-oriented design is all about…

Tuesday, September 8, 2020

MassTransit–Create a scoped filter that shares the scope with a consumer

Here is what I wanted to achieve:

I want a MassTransit filter that reads some user information from a message header and stores it in a scoped class. The idea is that the same scope applies inside my consumer, so that I’m guaranteed the correct audit information is written to the database.

This use case doesn’t sound too hard, and a first look at the documentation showed that you can have scoped filters: https://masstransit-project.com/advanced/middleware/scoped.html#available-context-types

Unfortunately these filters don’t share the same scope with the consumer and turned out not to be a good solution for my use case. As I mentioned yesterday, I discovered the way to go when having a look at the MassTransit pipeline:

What I need is an implementation of a ConsumerConsumeContext filter (quite a mouthful).

Let’s take a look at the steps involved:

  • First we create a ConsumerConsumeContext filter.
    • Notice that to get it working I had to use the Service Locator pattern and resolve the IoC instance through the context provided in the Send method.
  • Now we need to add the necessary registrations. I’m registering the IUserFactory that stores the user’s data as a scoped instance.
    • I’m using Autofac here, but the code is quite similar for other IoC containers.
  • As a last step we need to apply the filter to our consumers. I’m doing this through a definition file, but there are other ways to configure this as well (see the sketch after this list):
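Here is a rough sketch of those steps together; the 'user-name' header, the IUserFactory shape and the Autofac payload access are assumptions based on my setup, and MyConsumer stands in for your own consumer:

using System.Threading.Tasks;
using Autofac;
using GreenPipes;
using MassTransit;
using MassTransit.Definition;

// Hypothetical abstraction that holds the current user for this scope.
public interface IUserFactory
{
    void SetUser(string userName);
}

// Step 1: the ConsumerConsumeContext filter, resolving from the consumer's scope.
public class UserScopeFilter<TConsumer> : IFilter<ConsumerConsumeContext<TConsumer>>
    where TConsumer : class
{
    public async Task Send(ConsumerConsumeContext<TConsumer> context, IPipe<ConsumerConsumeContext<TConsumer>> next)
    {
        // Service Locator: resolve from the lifetime scope MassTransit created for this consumer.
        var scope = context.GetPayload<ILifetimeScope>();
        var userFactory = scope.Resolve<IUserFactory>();

        if (context.Headers.TryGetHeader("user-name", out var userName))
            userFactory.SetUser((string)userName);

        await next.Send(context);
    }

    public void Probe(ProbeContext context) => context.CreateFilterScope("userScope");
}

// Step 2 (inside your Autofac module): register the IUserFactory implementation as a scoped instance.
//   builder.RegisterType<UserFactory>().As<IUserFactory>().InstancePerLifetimeScope();

// Step 3: apply the filter through a consumer definition.
public class SomeMessage { }

public class MyConsumer : IConsumer<SomeMessage>
{
    public Task Consume(ConsumeContext<SomeMessage> context) => Task.CompletedTask;
}

public class MyConsumerDefinition : ConsumerDefinition<MyConsumer>
{
    protected override void ConfigureConsumer(
        IReceiveEndpointConfigurator endpointConfigurator,
        IConsumerConfigurator<MyConsumer> consumerConfigurator)
    {
        consumerConfigurator.UseFilter(new UserScopeFilter<MyConsumer>());
    }
}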

Monday, September 7, 2020

MassTransit–Receive pipeline

The last few days I have been struggling with middleware and filters in MassTransit. I had a hard time understanding what was really going on.

Until I stumbled over the following image in the documentation:

I wish I had seen this image sooner. It would have saved me a lot of time!

As a side note: it is a good example of how the right amount of documentation can really make a difference. I got into the habit of spending more time on documenting what I’m building, and this not only helps me structure my thoughts but also helps a lot in the design process and has even made some flaws in my reasoning visible early.

So my advice for today: document more. (I hope my team is reading this)

Friday, September 4, 2020

.NET Core Nuget - Showing a readme file after restore

In .NET Core and .NET Standard projects, content or tools distributed as part of a NuGet package are no longer installed. However, there is one (hidden?) feature that still works, and that is showing a readme file. Let’s see how to get this done:

  • Add a readme.txt file to the project that you package through NuGet.
  • Open the .csproj file and add the following entry:
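The entry, based on the NuGet docs, looks like this:

<ItemGroup>
  <None Include="readme.txt" Pack="true" PackagePath="" />
</ItemGroup>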

This will embed a readme.txt file in the root folder of the package. When the package is restored, the file gets displayed by Visual Studio:

Thursday, September 3, 2020

GraphQL–HotChocolate–How to handle an incomplete schema

HotChocolate puts a lot of effort into validating that you produced a valid schema before your application starts up. This is useful when a feature is ready, but during development it can be annoying, as you may want to run your application before having resolvers ready for every field in your schema.

By default when you try to run with an incomplete schema, you’ll get an error message similar to the following:

An error occurred while starting the application.

SchemaException: The schema builder was unable to identify the query type of the schema. Either specify which type is the query type or set the schema builder to non-strict validation mode.

HotChocolate.SchemaBuilder.Create() in SchemaBuilder.Create.cs, line 0

· SchemaException: The schema builder was unable to identify the query type of the schema. Either specify which type is the query type or set the schema builder to non-strict validation mode.

o HotChocolate.SchemaBuilder.Create() in SchemaBuilder.Create.cs

As mentioned inside the error message you can change this behavior by setting the schema builder to non-strict validation mode. But how exactly can you do this?

I’ll give you the answer: during schema creation you can modify the options:
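In HotChocolate 10 this looks something like the following sketch; the rest of the schema configuration is elided:

services.AddGraphQL(sp => SchemaBuilder.New()
    .AddServices(sp)
    .AddQueryType<QueryType>()
    .ModifyOptions(options => options.StrictValidation = false) // allow an incomplete schema
    .Create());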

Wednesday, September 2, 2020

Azure DevOps - Improve build time

During our last retrospective, one of the complaints was that the build time was too long. We were using the Hosted Build Services on Azure and restoring the NPM packages on every build took way too long.

Fortunately, Azure DevOps now offers a build task for this: the Cache task. This task will restore the cached data based on the provided inputs. Let’s see how we can use this task to cache our NPM packages:

  • Add the cache task to your build steps

  • Now we first need to set a cache key. The cache key is used as the identifier for the cache and can be a combination of string values, file paths and file patterns. In our case we want to use the package-lock.json file as the key. Every time the package-lock.json changes, the cache will be invalidated and the npm packages will be restored again.
    • **/package-lock.json, !**/node_modules/**/package-lock.json, !**/.*/**/package-lock.json
  • The second thing we need to set is the directory to populate the cache from and restore back to. As we want to cache the npm packages we cache the ‘node_modules’ folder:
    • $(System.DefaultWorkingDirectory)/Bridges.CMS.SPA/node_modules
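Putting it together in YAML, using the same key and path (a sketch):

- task: Cache@2
  displayName: Cache npm packages
  inputs:
    key: '**/package-lock.json, !**/node_modules/**/package-lock.json, !**/.*/**/package-lock.json'
    path: '$(System.DefaultWorkingDirectory)/Bridges.CMS.SPA/node_modules'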

Now if we build again, the first time the cache is populated, and the second time we save a lot of time as the node_modules folder is restored from the cache. Yippie!


Tuesday, September 1, 2020

ASP.NET Core Swagger error - Conflicting method/path combination

After adding a new method to a controller, my OpenAPI (Swagger) endpoint started to complain with the following error message:

Conflicting method/path combination "POST api/Orders" for actions - eShopExample.Web.Controllers.OrdersController.CreateOrder (eShopExample),eShopExample.Web.Controllers.OrdersController.ProcessOrders (eShopExample). Actions require a unique method/path combination for Swagger/OpenAPI 3.0. Use ConflictingActionsResolver as a workaround

I don’t find the error message itself very insightful, but taking a look at my OrdersController made it all clear:

The problem was that I had two methods that were both using attribute-based routing (through the [HttpPost] attribute) but were resolved to the same URI: “api/orders”. To fix it I had to use an overload of the [HttpPost] attribute and specify an alternative URI:
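A sketch of the fix; the controller and method names follow the ones from the error message, while the Order type and the 'process' route are illustrative:

public class Order { }

[Route("api/[controller]")]
[ApiController]
public class OrdersController : ControllerBase
{
    [HttpPost] // resolves to POST api/Orders
    public IActionResult CreateOrder(Order order)
    {
        // ...
        return Ok();
    }

    // Before the fix this was a bare [HttpPost] too, producing a second POST api/Orders.
    [HttpPost("process")] // now resolves to POST api/Orders/process
    public IActionResult ProcessOrders()
    {
        // ...
        return Ok();
    }
}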