Friday, June 28, 2019

Tips from NDC Oslo 2019–Flexbox Froggy

I have to admit that I’m really bad at CSS (although I was really proud of the linear-gradient I created last week). Before, I always used Bootstrap’s grid layout system and didn’t worry too much about it. But I feel the time is right to really start understanding Flexbox and the newer CSS Grid.

And what better way to learn than through gamification? So stop disturbing me, I’m trying to get a frog on a lilypad…

Thursday, June 27, 2019

Tips from NDC Oslo 2019 - .NET Core ReadyToRun

In the full .NET Framework you had the concept of native images, generated using Ngen.exe (Native Image Generator). By creating a native image, you can improve the startup time and performance of managed applications. A native image contains compiled, processor-specific machine code. This image is installed into the native image cache on your local computer. The runtime can then use native images from the cache instead of using the just-in-time (JIT) compiler to compile the original assembly.

With the upcoming release of .NET Core 3.0, a similar feature arrives in .NET Core through ReadyToRun (R2R) images. R2R is a form of ahead-of-time (AOT) compilation.

R2R binaries improve startup performance by reducing the amount of work the JIT needs to do as your application is loading. The binaries contain similar native code as what the JIT would produce, giving the JIT a bit of a vacation when performance matters most (at startup). R2R binaries are larger because they contain both intermediate language (IL) code, which is still needed for some scenarios, and the native version of the same code, to improve startup.

In contrast to NGen, where compilation must be done on client machines, R2R images are generated using crossgen as part of your build and are “ready to run” without any additional work on client machines.
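If you want to try it with the .NET Core 3.0 preview bits, ReadyToRun compilation is enabled through a project property and requires a runtime identifier; a minimal sketch (the RID below is just an example):

```xml
<!-- Enable ReadyToRun compilation when publishing (requires a runtime identifier) -->
<PropertyGroup>
  <PublishReadyToRun>true</PublishReadyToRun>
  <RuntimeIdentifier>win-x64</RuntimeIdentifier>
</PropertyGroup>
```

Then run `dotnet publish -c Release`; the R2R images end up in the publish folder.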

For more details and some performance benchmarks, read more here:

Wednesday, June 26, 2019

Tips from NDC Oslo 2019 - Azure Cosmos DB Data Migration tool

If you want to start using Azure Cosmos DB, you probably want a way to get some existing data from multiple sources into Azure Cosmos DB.

No worries, this is all possible through the Azure Cosmos DB Data Migration tool. You can import from JSON files, CSV files, SQL, MongoDB, Azure Table storage, Amazon DynamoDB, and even Azure Cosmos DB SQL API collections.

You can download the migration tool source code from this repository on GitHub. Or if you don’t want to compile the code yourself, you can download a pre-compiled binary.
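Besides the UI (dtui.exe), the tool also ships with a command-line version (dt.exe). From memory, importing a JSON file looks roughly like this — the file, database, and collection names are illustrative and parameter names may differ per version:

```shell
dt.exe /s:JsonFile /s.Files:products.json ^
       /t:DocumentDB ^
       /t.ConnectionString:"<cosmos-connection-string>;Database=<database>" ^
       /t.Collection:products
```
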

To get started, you can follow the tutorial here:

Tuesday, June 25, 2019

ASP.NET Core 2.2 free ebook

Just in time for the summer holidays, Syncfusion released another free ebook. This time on ASP.NET Core 2.2.

Download it for free from the Free Ebooks section of Syncfusion Tech Portal.

Monday, June 24, 2019

Nice quote–Entrepreneurship

A quote by Reid Hoffman, co-founder of LinkedIn:

“If you are not embarrassed by the first version of your product, you've launched too late.”

Reid Hoffman

Friday, June 21, 2019

Tips from NDC Oslo 2019–The C stands for Cascade

People seem to forget the power of CSS. If you can’t remember exactly how CSS selectors work and why the cascading part is so important, go and improve your CSS skills with CSS Diner. There are 32 levels to go through, and you solve every challenge by writing the correct CSS selector:
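As a quick reminder of why specificity matters in the cascade, here’s a small sketch (the class name is illustrative):

```css
/* Both rules match <a class="active"> inside <nav>;
   the more specific selector wins, regardless of source order. */
nav a       { color: blue;  } /* specificity 0,0,2 */
nav .active { color: green; } /* specificity 0,1,1 – this one applies */
```
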

Thursday, June 20, 2019

Tips from NDC Oslo 2019–SpendOps with Azure Cosmos DB

During a presentation today about Azure Cosmos DB, the cost aspect of Cosmos DB was mentioned. Although Cosmos DB doesn’t have to be expensive, you hear a lot of horror stories. The challenge is in choosing the right settings for your use case or scenario.

To help you keep the cost under control, the SpendOps with Azure Cosmos DB post by William Liebenberg was mentioned. In this post he explains an approach where an Azure DevOps release gate is used to check the cost of a particular feature or function. When it is higher than expected, the release is blocked.

Here is the related presentation video:

Wednesday, June 19, 2019

Postman 7.2–GraphQL support

Although it was possible to use GraphQL with Postman, it wasn’t really supported out-of-the-box… until today. With the release of Postman 7.2, GraphQL is now officially supported. Sending GraphQL queries in the request body, using GraphQL variables, GraphQL query autocompletion… it’s all available inside Postman.

Getting started

  • Create a new request in Postman
  • Enter a GraphQL endpoint for the URL, e.g.
  • Select POST as your HTTP method
  • Go to the Body tab and select the GraphQL radio button
  • Now you can enter your GraphQL query and hit SEND:
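For example, a query against a hypothetical schema — the $code variable goes in the GraphQL Variables section below the query:

```graphql
query GetCountry($code: ID!) {
  country(code: $code) {
    name
    capital
  }
}
```
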

Notice that the UI mentions 'no schema' and that you don’t get autocompletion out-of-the-box. To enable this you have to add the GraphQL API schema to Postman:

  • Go to the APIs tab in Postman
  • Click on the + New API button on the left
  • Name your API
  • Select GraphQL as your spec from the dropdown menu on the Define tab
  • Add your schema and save

More information:

Tuesday, June 18, 2019

Tips from NDC Oslo 2019 - Brighter

I’m currently attending NDC Oslo and wow is there a lot to learn!

A tool mentioned during one of the talks was Brighter:

I still have to try it out myself, but the support for Task Queues piques my curiosity…
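From what I gathered in the docs, the basic programming model pairs a Command with a RequestHandler; a rough sketch (all names here are mine, not from the talk):

```csharp
using System;
using Paramore.Brighter;

public class GreetingCommand : Command
{
    public GreetingCommand(string name) : base(Guid.NewGuid()) => Name = name;
    public string Name { get; }
}

public class GreetingCommandHandler : RequestHandler<GreetingCommand>
{
    public override GreetingCommand Handle(GreetingCommand command)
    {
        Console.WriteLine($"Hello, {command.Name}!");
        return base.Handle(command);
    }
}

// Dispatch in-process with Send(); Post() hands the command off to a task queue instead.
// commandProcessor.Send(new GreetingCommand("NDC Oslo"));
```
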

Monday, June 17, 2019

Tips from NDC Oslo 2019 - F# Interactive - #time

During the workshop I followed, I learned about the handy #time directive that can be used in F# Interactive.

Executing the #time directive will enable basic profiling.

Let’s try the code below in F# Interactive (FSI):
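A minimal reconstruction that matches the output shown below:

```fsharp
#time

let strings = [| "Machine"; "Learning"; "with"; "F#"; "is"; "fun" |]
let lengths = strings |> Array.map String.length
```
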

This will produce the following output:

--> Timing now on

Real: 00:00:00.000, CPU: 00:00:00.000, GC gen0: 0, gen1: 0, gen2: 0
val strings : string [] = [|"Machine"; "Learning"; "with"; "F#"; "is"; "fun"|]
val lengths : int [] = [|7; 8; 4; 2; 2; 3|]


We get the real time and CPU time, as well as some information about garbage collection in generations 0, 1 and 2.

To disable it again, execute the #time directive a second time.

Friday, June 14, 2019


Can you do the following:

  • Open Google Chrome
  • Enter the following URL in your address bar: chrome://net-export

You should see this:

Congratulations, you’ve found the Network log tool. This allows you to create a log file of the browser's network-level events and state.

Step-by-step guide

  1. Open a new tab and go to chrome://net-export/
  2. Click the Start Logging To Disk button.
  3. Reproduce the network problem in a different tab (the chrome://net-export/ tab needs to stay open or logging will automatically stop.)
  4. Click the Stop Logging button.

Afterwards you can view the log file using the NetLog Viewer.

Thursday, June 13, 2019

Azure Pipelines- Agentless/Server jobs

While demonstrating some new Azure DevOps (TFS) functionality to a customer, I got the following question: “What are agentless jobs?”

Good question, let’s try to give an answer:

In Azure Pipelines there are 4 types of jobs:

  • Agent pool jobs run on an agent in an agent pool.
  • Server jobs run on the Azure DevOps server.
  • Container jobs run in a container on an agent in an agent pool.
  • Deployment group jobs run on machines in a deployment group. These jobs are only available in a release pipeline.

Our question is related to the second type of job: server jobs, also known as agentless jobs.

Whereas most jobs are executed by a build agent, server jobs are executed directly on the Azure DevOps application server. This means that a server job does not require an agent or any target computers.

Creating an agentless job

To create an agentless job, open up a build pipeline. Click on the … button at the pipeline level and choose Add an agentless job from the dropdown menu.

Now a new agentless job phase is added to your build pipeline.

Inside this phase, you can add tasks similar to the other phases.

Remark: Notice that only a limited set of tasks is supported in the Agentless job phase.
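In a YAML pipeline, the equivalent is a job that targets the server pool; a minimal sketch using the built-in Delay task (job name is illustrative):

```yaml
jobs:
- job: waitBeforeDeploy
  pool: server        # this makes it an agentless/server job
  steps:
  - task: Delay@1     # one of the few tasks that can run server-side
    inputs:
      delayForMinutes: '5'
```
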

Wednesday, June 12, 2019

Error MSB4018: The "SqlBuildTask" task failed unexpectedly. Microsoft.Isam.Esent.Interop.EsentVersionStoreOutOfMemoryException:

A customer contacted me about a problem on their build servers. The build for one of their applications failed sometimes with the following error message:

C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v14.0\SSDT\Microsoft.Data.Tools.Schema.SqlTasks.targets(550,5): Error MSB4018: The "SqlBuildTask" task failed unexpectedly. Microsoft.Isam.Esent.Interop.EsentVersionStoreOutOfMemoryException: Version store out of memory (cleanup already attempted) at Microsoft.Isam.Esent.Interop.Api.Check(Int32 err)

The solution contained a SQL Server database project, which caused the error above. When we removed the database project from the build configuration, the error disappeared.

Further investigation brought us to a final solution: updating the SQL Server Data Tools on the build server solved the problem.

Tuesday, June 11, 2019

Azure App Services–Enable HTTP2

Although support for HTTP/2 was introduced in Azure App Service more than a year ago, HTTP/1.1 is still the default when you create a new App Service.

As HTTP/2 brings a lot of performance improvements, it can be worth making the switch. Let’s see how to enable it:

  • Go to the Azure Portal and select the App Service you want to configure from the list of Resources
  • Click on the Settings –> Configuration item to open the Configuration blade.

  • Click on the General settings and scroll down to the Platform settings

  • Change the HTTP version to 2.0 and click on Save.
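If you prefer the command line, the same setting can be flipped with the Azure CLI (replace the placeholders with your own names):

```shell
az webapp config set \
  --name <app-name> \
  --resource-group <resource-group> \
  --http20-enabled true
```
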

Friday, June 7, 2019

Package Manager Console–Error when calling Update-Database

When trying to execute an EF Core migration inside the Package Manager console, I got the following error message:

An error occurred while accessing the IWebHost on class 'Program'. Continuing without the application service provider. Error: Please contact the administrator. The following database migrations are pending:


Unable to create an object of type 'SampleContext'. For the different patterns supported at design time, see

When browsing through the code to investigate the problem I noticed the following code snippet:

This code snippet was introduced to prevent migrations from being executed outside the local development environment. So maybe my environment was not correct?

I double checked the ASPNETCORE_ENVIRONMENT setting but this was set to ‘local’.

Why did this still fail?

The problem was that I was not running the application normally but was using the Package Manager Console to execute a migration. The Package Manager Console uses the default environment, which explains why I hit this error message.

To fix it I had to explicitly set the environment variable inside the Package Manager console:
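In my case that came down to the following (the Package Manager Console is PowerShell-based):

```powershell
# Run inside the Package Manager Console before executing the migration
$env:ASPNETCORE_ENVIRONMENT = "local"
Update-Database
```
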


Thursday, June 6, 2019

ASP.NET Core Implicit Cast

While doing a code review I noticed the following code inside an ASP.NET(Core) controller:
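The snippet looked something like this (the repository and action names are illustrative):

```csharp
[HttpGet]
public ActionResult<IEnumerable<Product>> GetProducts()
{
    List<Product> products = _productRepository.GetAll();
    return products; // a List<Product>, yet the method compiles fine
}
```
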

Do you notice something special?

No? Let me give you a hint. Take a look at the return type, ActionResult<IEnumerable<Product>>, and then at the type that is actually returned, List<Product>. Although the two types don’t match, this code compiles and runs perfectly.

What is the magic happening here? This all works thanks to the fact that the ActionResult<T> type supports implicit casting.

Now what happens if we update the code to the following:
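Something along these lines, with only the local variable’s type changed (names again illustrative):

```csharp
[HttpGet]
public ActionResult<IEnumerable<Product>> GetProducts()
{
    IEnumerable<Product> products = _productRepository.GetAll();
    return products; // no implicit conversion from an interface type
}
```
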

This time you get a compiler exception:

Cannot implicitly convert type 'System.Collections.Generic.IEnumerable<eShopExample.Product>' to 'Microsoft.AspNetCore.Mvc.ActionResult<System.Collections.Generic.IEnumerable<eShopExample.Product>>'

Whoops! Why does this no longer work?

The reason is that where we were returning a concrete type (List<T>) before, we are now returning an interface (IEnumerable<T>). C# doesn't support implicit cast operators on interfaces.

To fix this, we can add a ToList() call to convert the interface back to a concrete type.

Wednesday, June 5, 2019

Azure DevOps Server – Build statuses

After migrating to Azure DevOps Server 2019, I got stuck when trying to reconfigure a XAML build controller. Instead of replacing the existing build controller, I got an error explaining that the build controller I was using still had a build in progress.

Time to dig into the Azure DevOps database to find out which build I should cancel…

The ‘go to’ table is tbl_BuildQueue. Every build in the queue has a status.

Here are all the statuses for tbl_BuildQueue:

  • 0 – None
  • 1 – In progress
  • 2 – Queued
  • 4 – Postponed
  • 8 – Completed
  • 16 – Cancelled

Create an update query that sets all ‘In progress’ builds to ‘Cancelled’:

UPDATE tbl_BuildQueue SET [Status]=16 where [Status]=1

If this doesn’t help you can also check the tbl_Build table.

Here are all the statuses for tbl_Build:

  • 1 – In progress
  • 2 – Succeeded
  • 4 – Partially Succeeded
  • 8 – Failed
  • 16 – Stopped

Let’s update the builds here as well:

UPDATE tbl_Build SET [BuildStatus]=16 where [BuildStatus]=1

Tuesday, June 4, 2019

Azure DevOps Build error - The path ‘’ is already mapped to workspace ‘’.

After renaming our collection, we started to get build errors on our build server for the builds that were using TFS Version control. The error we got was the following:

The path ‘c:\build\appname\sources’ is already mapped to workspace ‘BuildServer_123’.

When I looked at the details of the workspace mapping using

tf workspaces /owner:*

I noticed that it was still referring to the old collection name instead of the new one. That explained why TFS tried to create the workspace again, resulting in the error message above.

To fix the issue I had to clear the local TFS cache on the build server. This can be done by deleting the content of the following folder:

C:\Users\{BuildServiceAccount}\AppData\Local\Microsoft\Team Foundation\{VersionNumber}\Cache

{BuildServiceAccount} refers to the account that is used to run the build agent and {VersionNumber} refers to the used Azure DevOps version.

Monday, June 3, 2019

Entity Framework Core 2.2 - Query Tags

EF Core 2.2 introduces a new feature called query tags. This makes it easier to find specific queries in log files and various output windows.

Applying a query tag can be done using the TagWith() method:
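For example (the entity and tag text are illustrative):

```csharp
var products = await context.Products
    .TagWith("ProductController: load catalog page")
    .ToListAsync();
```

The tag shows up as a SQL comment (-- ProductController: load catalog page) right above the generated query.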

Thanks to this extension method, the tag is always logged together with the query.