Thursday, April 30, 2020

Domain Driven Design–Get rid of your anemic domain model

Although most developers have heard about Domain Driven Design, most applications I encounter today are the traditional ‘layered cake’ with an anemic domain model.

I have to admit that even a lot of the applications I’ve helped build over the last few years still use this approach. Why? Two reasons:

  • Lack of DDD knowledge in the teams. Most developers ‘know’ the tactical DDD patterns but are not aware of the strategic DDD patterns. What makes it even worse is that even tactical patterns like the repository pattern are often wrongly applied. Getting a team up to speed with DDD takes time, time that we unfortunately don’t always have in ‘fixed price, fixed time, fixed scope’ projects. Another reason to go for the ‘product, not project’ approach.
  • Lack of access to domain experts. Good domain modelling can only work if the necessary expertise is around to help you find the correct ubiquitous language, identify bounded contexts and create a shared understanding of the business. Too many organizations are still siloed: a developer can talk to a functional analyst, a functional analyst can talk to a business analyst, and a business analyst can talk to the business (and in fact not even the real users but someone who was assigned as a representative). It’s one of the reasons I like BDD and ‘the 3 amigos’, as they help to close the gap.

Enough excuses… What does a well-designed DDD project look like? There are not a lot of good examples out there.

One example I can recommend to help you get started is this one: https://github.com/asc-lab/better-code-with-ddd. It builds the same application twice: once using the traditional layered approach and once using DDD. Also take a look at the accompanying article: https://altkomsoftware.pl/en/blog/create-better-code-using-domain-driven-design/.

Wednesday, April 29, 2020

Microsoft Build 2020 - Digital Event

This year Microsoft Build will be fully digital. A unique opportunity to get a non-stop, 48-hour interactive experience.

It will be more than just an online streaming event. You can help shape the agenda by submitting a specific topic during registration and even get 1:1 expert guidance from the Microsoft engineers!

Register here: https://mybuild.microsoft.com/

Tuesday, April 28, 2020

Azure AppService–How to identify a bottleneck?

My first answer to the question in the title would be to use Application Insights. But in case you didn’t configure Application Insights for your AppService, here is an alternative way to gain some insights.

To immediately spoil the answer, we will use Project Kudu to look behind the curtains of our Azure App Service.

How to find Project Kudu?

You can reach the Kudu console by browsing to https://<yourappservice>.scm.azurewebsites.net or through the Advanced Tools section of your App Service in the Azure Portal.

Process Explorer

  • Click on the Process Explorer at the top
    • Remark: This is only available when you are running the Windows version of an App Service
  • Now you can see all the processes and even start profiling

Monday, April 27, 2020

C# 8–Null-coalescing

Although C# 8 has been released for a while, I’m still discovering new features.

When I want to assign a value to a variable only when it is null, I typically use either the conditional ternary operator:

or a coalesce expression:

With C# 8, you can write this even shorter with the null-coalescing assignment operator:
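A quick sketch of the three variants (the variable names and the fallback value are just for illustration):

```csharp
string name = null;

// Conditional ternary operator
name = name != null ? name : "unknown";

// Coalesce expression
string title = null;
title = title ?? "unknown";

// C# 8 null-coalescing assignment: only assigns when the left-hand side is null
string city = null;
city ??= "unknown";
```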

Friday, April 24, 2020

XUnit - Make your tests more readable

One of the neat tricks that you can do in XUnit is to make your tests more readable.

Here is one of my unit tests:

And here is how this looks by default in the Test Explorer:

Adding a configuration file

Let’s now add a configuration file to improve the default behavior.

  • Add a new JSON file to the root of your test project. Name the file xunit.runner.json.

  • Add a schema reference to get autocomplete behavior while editing the file:

    • Remark: In case of Visual Studio, this is not really necessary as it will recognize the file name and apply the correct schema automatically. 
  • Tell the build system to copy the file into the build output directory. Edit your .csproj file and add the following:

    <ItemGroup>
      <Content Include="xunit.runner.json" CopyToOutputDirectory="PreserveNewest" />
    </ItemGroup>

Update the configuration

Let’s now tweak the behavior by changing some settings in the configuration file:
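A minimal example of such a configuration file (the methodDisplay setting switches the displayed test names from the default ‘classAndMethod’ to just the method name):

```json
{
  "$schema": "https://xunit.net/schema/current/xunit.runner.schema.json",
  "methodDisplay": "method"
}
```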

This will make our test list already more readable:

Remark: If you don’t see anything happening, check that you have the latest version of XUnit and XUnit.Runner.VisualStudio installed.

I invite you to check out the documentation for some other options you can tweak: https://xunit.net/docs/configuration-files

Thursday, April 23, 2020

How to keep secrets in Azure Functions?

While doing a code review of an Azure Function I noticed the following line:

var personalAccessToken = Environment.GetEnvironmentVariable("KeyVault_PersonalAccessToken", EnvironmentVariableTarget.Process);

Could it be that the developer stored a secret directly in an environment variable, instead of using secure storage like Azure Key Vault?

Inside the configuration on the Azure Portal I found the following:

@Microsoft.KeyVault(SecretUri=https://sample-vault.vault.azure.net/secrets/personal-access-token/<removedthekey>)

Luckily my assumption was wrong and it turned out to be a feature of Azure (Functions) I wasn’t aware of.

To use this feature you first have to create a Managed Service Identity for your Azure Functions app as described here: https://docs.microsoft.com/en-us/azure/app-service/overview-managed-identity?tabs=dotnet#creating-an-app-with-an-identity

Once you have a Managed Service Identity you can add an Access policy in your Azure Keyvault:

Now you can copy the secret identifier from the secret you want to use:

This secret identifier should be added to your Azure Function configuration using a special reference syntax:

@Microsoft.KeyVault(SecretUri={theSecretUri})

Thanks Dario for teaching me this trick!

Wednesday, April 22, 2020

.NET Core - Using ClaimsIdentity in your unit tests

To be able to test my application I had to create a ‘dummy’ ClaimsPrincipal and ClaimsIdentity:

However, when I ran my test, it failed because the ClaimsIdentity returned ‘false’ for the IsAuthenticated check.

To create an authenticated ‘dummy’ identity I had to pass on an extra ‘authenticationType’ parameter to the constructor:
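As a sketch (the claim values and the authentication type are just examples):

```csharp
using System.Security.Claims;

// Without an authenticationType the identity is NOT authenticated
var anonymousIdentity = new ClaimsIdentity(
    new[] { new Claim(ClaimTypes.Name, "testuser") });
// anonymousIdentity.IsAuthenticated == false

// Passing any non-empty authenticationType makes the identity authenticated
var authenticatedIdentity = new ClaimsIdentity(
    new[] { new Claim(ClaimTypes.Name, "testuser") },
    "TestAuthType");
// authenticatedIdentity.IsAuthenticated == true

var principal = new ClaimsPrincipal(authenticatedIdentity);
```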

The exact value that you specify doesn't matter.

Tuesday, April 21, 2020

.NET Core–Using IOptions<> in your unit test

An easy and typesafe way to use configuration values in your .NET Core applications is through IOptions<T>.

This allows you to create a settings section in your appsettings.json:

And then map this to a specific class:

The only thing you need to do is to specify the section in your Startup.cs:

Now you can inject these settings in your code through IOptions<T>:
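The wiring boils down to something like this (all names — MailSettings, MailService, the section name — are hypothetical; substitute your own):

```csharp
// appsettings.json:
// {
//   "MailSettings": { "Smtp": "localhost", "Port": 25 }
// }

// The matching settings class:
public class MailSettings
{
    public string Smtp { get; set; }
    public int Port { get; set; }
}

// Startup.cs - bind the section to the class:
services.Configure<MailSettings>(Configuration.GetSection("MailSettings"));

// The consuming class - inject the settings through IOptions<T>:
public class MailService
{
    private readonly MailSettings _settings;

    public MailService(IOptions<MailSettings> options)
    {
        _settings = options.Value;
    }
}
```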

But what if you want to unit test this class? How can you create an IOptions<T> instance?

A solution exists through the Options.Create() method:
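A sketch (MailSettings and MailService are hypothetical names for your settings class and the class under test):

```csharp
using Microsoft.Extensions.Options;

// Wrap a plain settings instance in an IOptions<T> for the test
IOptions<MailSettings> options = Options.Create(
    new MailSettings { Smtp = "localhost", Port = 25 });

var sut = new MailService(options);
// ... assert on the behavior of sut ...
```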

Monday, April 20, 2020

Building Secure and Reliable Systems

Google continues sharing their experience with Site Reliability Engineering (SRE) through a new ebook: Building Secure and Reliable Systems.

Can a system be considered truly reliable if it isn't fundamentally secure? Or can it be considered secure if it's unreliable? Security is crucial to the design and operation of scalable systems in production, as it plays an important part in product quality, performance, and availability. In this book, experts from Google share best practices to help your organization design scalable and reliable systems that are fundamentally secure.

Friday, April 17, 2020

Use your own machine with Visual Studio Online

With Visual Studio Online, you get a fully managed development environment in the cloud on demand. Recently they announced that you can also register your own machine and access it remotely through Visual Studio Code or the Visual Studio Online Web Editor.

Why would you want to do that?

Microsoft mentions the following reasons:

This is a great option for developers that want to cloud-connect an already configured work or home machine for anywhere access, or take advantage of the Visual Studio Online developer experience for specialized hardware we don’t currently support.

Let’s try it!

First make sure that you have an existing development plan:

  • Login using your company or Microsoft account
  • Check the top bar.
    • You have an existing plan? Great, you can skip to the next step.

    • You don’t have an existing plan? Let’s continue…

  • Click on the Create environment button

  • The Create a new billing plan window appears. Enter the necessary details.

  • Click on Create. Now the Create an Environment window appears.

  • As we want to use our own machine, we can skip this step. Just click Cancel.

Now it’s time to register our machine through the CLI. First we have to download and install the CLI:

  • On Windows:
    • Install via PowerShell by downloading and executing the install script.
  • On macOS:
    • brew install microsoft/vsonline/vso
  • On Linux:
    • apt install vso

Once the CLI is installed, run ‘vso start’. During the process, you will be asked to log in through your browser.

[2020-04-17 07:09:30.147 CLI I] vso v1.0.2005.59801 (pid: 22936)
[2020-04-17 07:09:30.454 CLI I] Authenticating...
To sign in, use a web browser to open the page
https://microsoft.com/devicelogin and enter the code <CODE> to authenticate.
Would you like to run as a persistent service/daemon? [Y/n]: n
[2020-04-17 07:10:18.108 CLI I] Will run as a process
[2020-04-17 07:10:18.623 CLI I] A VSO plan is required to register your environment.
[2020-04-17 07:10:18.627 CLI I] Using VSO plan: /subscriptions/<subscriptionid>/resourceGroups/vso-rg-a7ff597/providers/Microsoft.VSOnline/plans/vso-plan-westeurope
Enter an environment name or blank to use [<MachineName>]:
[2020-04-17 07:10:27.634 CLI I] Creating your environment...
[2020-04-17 07:10:29.153 CLI I] Saving local configurations...
[2020-04-17 07:10:29.321 CLI I] Authenticating...
[2020-04-17 07:10:31.936 CLI I] Waiting for environment to become available...
[2020-04-17 07:10:38.483 CLI I] All done! Connect: https://online.visualstudio.com/environment/<environmentid>

Now if you open up Visual Studio Online, you should see your own machine:

To remove your machine again, you can use ‘vso stop’:

[2020-04-17 07:33:47.818 CLI I] vso v1.0.2005.59801 (pid: 8792)
Shared workspace not found: True
[2020-04-17 07:33:48.524 CLI I] Authenticating...
To sign in, use a web browser to open the page
https://microsoft.com/devicelogin and enter the code <CODE> to authenticate.
[2020-04-17 07:34:34.313 CLI I] Authentication successful. Removing your environment.
[2020-04-17 07:34:35.340 CLI I] Sucessfully removed your local environment.

Thursday, April 16, 2020

Track timings using Serilog

So far I have always used the Stopwatch class to track timings in my applications and just added the result to my log message. That is, until I discovered the SerilogTimings NuGet package.

Usage is simple, after you have configured Serilog, you can use Operation.Time() to time an operation:
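A sketch of what that looks like (the message template and order id are just examples):

```csharp
using Serilog;
using SerilogTimings;

Log.Logger = new LoggerConfiguration()
    .WriteTo.Console()
    .CreateLogger();

// Times everything inside the using block and logs the result on dispose
using (Operation.Time("Submitting payment for {OrderId}", "order-12345"))
{
    // ... the actual work ...
}
```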

At the completion of the using block (!), a message will be written to the log like:

[INF] Submitting payment for order-12345 completed in 456.7 ms

You can also use it directly on top of an ILogger instance:
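Roughly like this, via the TimeOperation() extension method in SerilogTimings.Extensions (PaymentService is a hypothetical class):

```csharp
using Serilog;
using SerilogTimings.Extensions;

ILogger logger = Log.ForContext<PaymentService>();

using (logger.TimeOperation("Submitting payment for {OrderId}", "order-12345"))
{
    // ... the actual work ...
}
```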

More info about this library can be found here: https://github.com/nblumhardt/serilog-timings

Wednesday, April 15, 2020

XUnit–.NET Standard

After creating a new project in Visual Studio to use for my unit tests, I added the following NuGet packages:

<PackageReference Include="xunit" Version="2.4.0" />
<PackageReference Include="xunit.runner.visualstudio" Version="2.4.0" />

However adding the second package resulted in the following warning message:

Package 'xunit.runner.visualstudio 2.4.0' was restored using '.NETFramework,Version=v4.6.1, .NETFramework,Version=v4.6.2, .NETFramework,Version=v4.7, .NETFramework,Version=v4.7.1, .NETFramework,Version=v4.7.2, .NETFramework,Version=v4.8' instead of the project target framework '.NETStandard,Version=v2.0'. This package may not be fully compatible with your project.

So what did I do wrong? I wasn’t really thinking when creating my new project and muscle memory made me click on a .NET Standard class library. This is of course a great choice if you just want to build a library, but not a good fit for our unit tests, which should target a specific framework (.NET Framework or .NET Core).

After changing the target framework to .NET Core the warning disappeared.

<PropertyGroup>
   <TargetFramework>netcoreapp3.0</TargetFramework>
</PropertyGroup>

More info about this can be found here:

https://xunit.net/docs/why-no-netstandard

Tuesday, April 14, 2020

ASP.NET Core–Get raw URL

To handle a specific use case I had to capture the current URL in ASP.NET Core. I found a lot of outdated answers on the Internet, so here is the correct way in case I forget (which I certainly will):

In the Microsoft.AspNetCore.Http.Extensions assembly a static UriHelper class exists that allows you to get the URL from an HttpRequest object:

var url = UriHelper.GetEncodedUrl(httpContext.Request);

This GetEncodedUrl() method can also be used as an extension method directly on the HttpRequest (don’t forget to import the namespace):

var url = httpContext.Request.GetEncodedUrl();
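Worth knowing: the same extensions class also contains a GetDisplayUrl() method. That one returns an unescaped, human-readable form that is only meant for display purposes, whereas GetEncodedUrl() is fully escaped and safe to reuse in headers or other URIs:

```csharp
using Microsoft.AspNetCore.Http.Extensions;

// Fully escaped; safe to reuse in headers or other URIs
var encodedUrl = httpContext.Request.GetEncodedUrl();

// Unescaped, human-readable; only for display (logging, tracing, ...)
var displayUrl = httpContext.Request.GetDisplayUrl();
```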

Maybe it helps you too!

Wednesday, April 8, 2020

SQL Server Query Store–How to use it in older SQL versions

One of the (not-so-hidden) gems in SQL Server is the Query Store. Unfortunately you need SQL Server 2016 or higher to be able to use this feature.

The good news is that there is an alternative for older SQL Server versions: OpenQueryStore.

OpenQueryStore (OQS) is a collection of scripts that add Query Store like functionality to pre-SQL Server 2016 Instances!

OQS is being built from the ground up to allow all versions and editions of SQL Server from 2005 up to and including 2014 to have a Query Store like functionality. The data collection, retention and cleanup will be easily configurable to allow for complete control of the OQS data storage.

Installation

  1. Download the latest release of Open Query Store from the releases page and unzip the file.

  2. Open a PowerShell console and navigate to the location of the unzipped files. Copy the command below, change the values of <Instance>, <dbName> and <path>, and OQS will be installed for you.

.\Install.ps1 -SqlInstance <Instance> -Database <dbName> -OQSMode Centralized -SchedulerType "Service Broker" -CertificateBackupPath "<path>"

Configuration

After the installation is done, you have to enable the data collection by setting the collection_active bit to true in the [oqs].[collection_metadata] table. When enough data is collected you can check the generated data through one of the available reports.

Reports

The OQS dashboards can be viewed directly from SSMS (v16 or v17).

  1. Download the OpenQueryStoreDashboard.rdl and OpenQueryStoreWaitStatsDashboard.rdl files from the GitHub page and store them on a machine with SSMS installed.
  2. Right-click on a user database that has the OQS enabled and select "Reports - Custom Reports".
  3. Navigate to the location where you stored the .rdl files and open them.

Tuesday, April 7, 2020

Dotnet format

I find code consistency important. Naming conventions, code formatting, … should all be aligned to make the code readable and consistent.

In .NET you can enforce this consistency through the .editorconfig file. (If you don’t have one in your projects, please stop reading and go add one first.) This is not the first time I’ve mentioned the .editorconfig file.

Today I want to take it one step further and enforce the coding style through our build pipeline. We’ll do this with the dotnet CLI tool dotnet-format.

Let’s first install it:

dotnet tool update -g dotnet-format

Now you can browse to your solution or project folder and invoke the tool using:

dotnet format

This will apply the code formatting rules defined in our .editorconfig to our code.

If we want to use it inside our builds, we probably don’t want to change the code itself. Instead we only want to check if the rules are followed. This can be done by adding the extra --check and --dry-run parameters:

dotnet format --check --dry-run

This is what the output looks like:

C:\Projects\mestbankportaal\MestbankPortaal\Mestbank.Core>dotnet format --check --dry-run
  Formatting code files in workspace 'C:\Projects\mestbankportaal\MestbankPortaal\Mestbank.Core\Mestbank.Core.csproj'.
  Warnings were encountered while loading the workspace. Set the verbosity option to the 'diagnostic' level to log warnings.
  Domain\IE_ROL_FUNCTIE.cs(3,35): Fix whitespace formatting.
  Services\GebruikerService.cs(25,37): Fix whitespace formatting.
  Services\GebruikerService.cs(25,38): Fix whitespace formatting.
  Services\MestbankAuthorizationService.cs(27,39): Fix whitespace formatting.
  Services\MestbankAuthorizationService.cs(27,64): Fix whitespace formatting.
  Services\MestbankAuthorizationService.cs(27,66): Fix whitespace formatting.
  Services\MestbankAuthorizationService.cs(40,50): Fix whitespace formatting.
  Services\MestbankAuthorizationService.cs(42,62): Fix whitespace formatting.
  Services\MestbankAuthorizationService.cs(42,77): Fix whitespace formatting.
  Services\MestbankAuthorizationService.cs(42,79): Fix whitespace formatting.
  Services\MestbankAuthorizationService.cs(61,9): Fix whitespace formatting.
  Formatted code file 'IE_ROL_FUNCTIE.cs'.
  Formatted code file 'GebruikerService.cs'.
  Formatted code file 'MestbankAuthorizationService.cs'.
  Format complete in 3695ms.

A non-zero exit code is returned if any files would have been changed. Thanks to this we can easily use it in a dotnet CLI task on our build agent.
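As a sketch, in an Azure DevOps YAML pipeline this could look like the following (the install step assumes dotnet-format is not already available on the agent):

```yaml
steps:
  # Make sure the dotnet-format global tool is available on the agent
  - script: dotnet tool install -g dotnet-format
    displayName: 'Install dotnet-format'

  # Fails the build (non-zero exit code) when files are not formatted
  - task: DotNetCoreCLI@2
    displayName: 'Verify code formatting'
    inputs:
      command: custom
      custom: format
      arguments: '--check --dry-run'
```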

Monday, April 6, 2020

Pluralsight–Stay home and skill up for free

With the current lockdown in most countries around the world, a lot of people have to work from home. To keep growing your skills, Pluralsight is offering free access during the entire month of April. Anyone who does not have a current subscription to Pluralsight can take advantage of this offer. No credit card is required to sign up and there will be no obligation beyond April.

Go here to subscribe and enjoy your free access in April.

Friday, April 3, 2020

Expose Kibana to the outside world

By default Kibana is configured to only be accessible from ‘localhost’. If you want to expose it outside your server, you’ll have to update the configuration.

  • Go to the config folder and open the kibana.yml file.
  • Find the following section:

# Specifies the address to which the Kibana server will bind. IP addresses and host names are both valid values.

# The default is 'localhost', which usually means remote machines will not be able to connect.

# To allow connections from remote users, set this parameter to a non-loopback address.

# server.host: "localhost"

  • Uncomment the ‘server.host’ section and specify a non-loopback address. An example:

server.host: "0.0.0.0"

  • Don’t forget to also open up the necessary ports in your firewall (port 5601 by default)

Thursday, April 2, 2020

Installing Kibana as a windows service

For Kibana (part of the ELK stack) no out-of-the-box script exists to install and run it as a Windows service. (This in contrast to ElasticSearch, which has a batch file that allows you to install it as a Windows service.)

As a workaround you can use NSSM: the Non-Sucking Service Manager.

With NSSM you can take any executable and run it as a Windows service.

Here are the steps to use it:

  • Download NSSM.
  • Extract the zip and put the nssm.exe executable on a location of your choice.
  • Run nssm install <servicename>; e.g. nssm install kibana
  • This will open up a configuration window where you can specify the executable you want to run and configure some other service-related settings

  • Click on Install service.

Wednesday, April 1, 2020

ElasticSearch– was created with version [5.3.0] but the minimum compatible version is [6.0.0-beta1]. It should be re-indexed in Elasticsearch 6.x before upgrading to 7.6.1.

As mentioned in some of the previous posts I am migrating an ‘old’ 5.3 instance of ElasticSearch to 7.6.1. In my first (too optimistic) attempt I directly migrated to ElasticSearch 7.6.1.

When starting the ElasticSearch cluster this resulted in the following error message:

[myindex/gTTtwHT3ShiAz-eR94eKmw]] was created with version [5.3.0] but the minimum compatible version is [6.0.0-beta1]. It should be re-indexed in Elasticsearch 6.x before upgrading to 7.6.1.

               at org.elasticsearch.cluster.metadata.MetaDataIndexUpgradeService.checkSupportedVersion(MetaDataIndexUpgradeService.java:113)
               at org.elasticsearch.cluster.metadata.MetaDataIndexUpgradeService.upgradeIndexMetaData(MetaDataIndexUpgradeService.java:87)
               at org.elasticsearch.gateway.GatewayMetaState.upgradeMetaData(GatewayMetaState.java:240)
               at org.elasticsearch.gateway.GatewayMetaState.upgradeMetaDataForNode(GatewayMetaState.java:223)
               at org.elasticsearch.gateway.GatewayMetaState.start(GatewayMetaState.java:154)
               at org.elasticsearch.node.Node.start(Node.java:705)
               at org.elasticsearch.bootstrap.Bootstrap.start(Bootstrap.java:273)
               at org.elasticsearch.bootstrap.Bootstrap.init(Bootstrap.java:358)
               at org.elasticsearch.bootstrap.Elasticsearch.init(Elasticsearch.java:170)
               at org.elasticsearch.bootstrap.Elasticsearch.execute(Elasticsearch.java:161)
               at org.elasticsearch.cli.EnvironmentAwareCommand.execute(EnvironmentAwareCommand.java:86)
               at org.elasticsearch.cli.Command.mainWithoutErrorHandling(Command.java:125)
               at org.elasticsearch.cli.Command.main(Command.java:90)
               at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:126)
               at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:92)

For complete error details, refer to the log at C:\Program Files\ElasticSearch\7.6.1\logs\elasticsearch.log

The trick is to first migrate to ElasticSearch 6.8 and Kibana 6.8. Kibana 6.8 offers an Upgrade Assistant which guides you through the required steps to upgrade your indices to a supported format.

Here are the steps:

  • Go to Kibana
  • Click on the Management icon on the left
  • Click on the 7.0 Upgrade Assistant
  • Check all the listed issues and apply the suggested solutions
  • When everything is fixed you can safely install ElasticSearch 7.x