Friday, September 29, 2017

Enabling Application Insights on an existing project

Yesterday I lost some time figuring out how to enable Application Insights on an existing project in Visual Studio.

I thought it was available in the context menu when you right-click on your Visual Studio project, but no such option could be found there:


Turns out you need to go one level deeper:

  • Right click on your project
  • Click on Add and select Application Insights Telemetry…


Now you can go through the configuration wizard by clicking on Start Free:


Thursday, September 28, 2017

Visual Studio 2017 Offline install - “Unable to download installation files”

After creating an offline installer for Visual Studio 2017 using vs_enterprise.exe --layout c:\vs2017offline, we were ready to install Visual Studio on our build servers (which don’t have Internet access).

However when we tried to run the installer, it failed after running for a few minutes with the following error message:

Unable to download installation files

This error message was not that useful, as the problem was not missing installation files, but the fact that we forgot to install the required certificates first.

To install the certificates, you have to:

  1. Browse to the "certificates" folder inside the layout folder you created (e.g. c:\vs2017offline\certificates)

  2. Right-click each one and choose Install PFX.

  3. Specify Local machine as target certificate store
  4. Leave the password field empty

More information:

Wednesday, September 27, 2017

Team Foundation Server–Upgrade your build agents

If you upgrade your TFS installation to a newer version, a new version of the build agent is available as well.

To upgrade your agents, you have 2 options:

  • If a new major version of the agent is released, you’ll have to manually delete the old agent and install a new agent.
  • If a new minor version of the agent is released, the existing agent is upgraded automatically when it runs a task that requires a newer version of the agent.
    • If you want to trigger the update manually, you can go to the Agent Pool hub, right click on a Queue and click on Update All Agents.


More information at

Tuesday, September 26, 2017

NPGSQL–Relation does not exist

PostgreSQL has great .NET support thanks to the open source Npgsql library.

From the home page:

Npgsql is an open source ADO.NET Data Provider for PostgreSQL, it allows programs written in C#, Visual Basic, F# to access the PostgreSQL database server. It is implemented in 100% C# code, is free and is open source.

In addition, providers have been written for Entity Framework Core and for Entity Framework 6.x.

However, I immediately ran into some problems the moment I tried to execute a query. The strange thing was that my code was almost the same as the code on the Getting Started page.

The error I got was the following:

Query failed: ERROR: relation "Northwind.Products" does not exist

I tried to execute the same query directly in the PGAdmin tool and indeed I got the same error.

What was I doing wrong? The problem is that PostgreSQL implicitly folds unquoted identifiers in a query to lowercase.

So the following query;

SELECT Id, ProductName FROM Northwind.Products

was transformed to

SELECT id, productname FROM northwind.products

As quoted identifiers keep their casing in PostgreSQL, the lowercased names no longer matched my table, which was created with mixed-case names.

There are 2 possible solutions:

  • Use quotes around your identifiers: SELECT "Id", "ProductName" FROM "Northwind"."Products"
  • Change the casing of your database objects (tables, columns, …) to lowercase

I chose the latter option, because otherwise I had to escape all the quotes inside my query strings in my C# code.
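For completeness, the quoted variant from C# with Npgsql; a sketch, where the connection string and the Northwind schema are illustrative:

```csharp
using System;
using Npgsql;

class Program
{
    static void Main()
    {
        using (var conn = new NpgsqlConnection("Host=localhost;Database=northwind;Username=postgres"))
        {
            conn.Open();

            // The quotes preserve the exact casing of the identifiers;
            // without them PostgreSQL folds everything to lowercase.
            var sql = "SELECT \"Id\", \"ProductName\" FROM \"Northwind\".\"Products\"";
            using (var cmd = new NpgsqlCommand(sql, conn))
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine($"{reader.GetInt32(0)}: {reader.GetString(1)}");
                }
            }
        }
    }
}
```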

Monday, September 25, 2017

TypeScript error–Property ‘assign’ does not exist on type ‘ObjectConstructor’

A colleague asked me for help when he got into trouble with his TypeScript code. Here is a simplified version:
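A reconstruction of the troublesome code (the names are invented, the shape is what matters):

```typescript
interface Options {
  retries: number;
  timeout: number;
}

const defaults: Options = { retries: 3, timeout: 1000 };

// With "target": "es5" in tsconfig.json the next line fails to compile with:
// error TS2339: Property 'assign' does not exist on type 'ObjectConstructor'.
const merged = Object.assign({}, defaults, { timeout: 5000 });

console.log(merged); // { retries: 3, timeout: 5000 }
```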

Although this looks like valid code, the TypeScript compiler complained:


After some head-scratching, we discovered that there was a “rogue” tsconfig.json at a higher level that set “ES5” as the target. Object.assign was only added as part of “ES6”, explaining why TypeScript complained.


After changing the target to “es6”, the error disappeared.

Friday, September 22, 2017

VSWhere.exe–The Visual Studio Locator

As someone who has built a lot of CI and CD pipelines, one of the struggles I always had when new Visual Studio versions were released was how to make my build server use the correct version of MSBuild when multiple Visual Studio versions were installed.

It got a lot better over the years, but even recently I was sitting together with a customer to investigate how we could make the build server understand that the Visual Studio 2017 Build tools should be used.

One of the (badly documented) tricks you could use was scanning the registry for specific registry keys. Luckily, Microsoft recently released a new tool that makes finding your Visual Studio instances a lot easier: vswhere.exe

From the documentation:

vswhere is designed to be a redistributable, single-file executable that can be used in build or deployment scripts to find where Visual Studio - or other products in the Visual Studio family - is located. For example, if you know the relative path to MSBuild, you can find the root of the Visual Studio install and combine the paths to find what you need.

You can emit different formats for information based on what your scripts can consume, including plain text, JSON, and XML. Pull requests may be accepted for other common formats as well.

vswhere is included with the installer as of Visual Studio 2017 version 15.2 and later, and can be found at the following location: %ProgramFiles(x86)%\Microsoft Visual Studio\Installer\vswhere.exe. The binary may be copied from that location as needed, installed using Chocolatey, or the latest version may be downloaded from the releases page. More information about how to get vswhere is on the wiki.

This tool is also used internally in the VSBuild build task in TFS to discover recent Visual Studio versions (2017 and newer).

A quick sample:

  • Open a command prompt
  • Browse to %ProgramFiles(x86)%\Microsoft Visual Studio\Installer\ or the location where you downloaded vswhere.exe.
  • Let’s try vswhere -latest


Thursday, September 21, 2017

.NET Standard: Using the InternalsVisibleToAttribute

In .NET Standard projects, the AssemblyInfo attributes are generated for you, so you no longer need a separate AssemblyInfo.cs file in your project Properties.

But what if you want to use the InternalsVisibleToAttribute? This was one of the attributes I used a lot to expose the internals of my assembly to my test projects.

Turns out it doesn’t really matter where you put this attribute. It is applied at the assembly level, so you can include it in any source code file you like. Using the AssemblyInfo file was just a convenience.

So what I did was create an empty .cs file and add the following code:
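Something along these lines ("MyLibrary.Tests" is a placeholder for your own test assembly name):

```csharp
using System.Runtime.CompilerServices;

// Expose the internals of this assembly to the test project.
// For a signed (strong-named) test assembly, the full public key
// has to be appended to the assembly name.
[assembly: InternalsVisibleTo("MyLibrary.Tests")]
```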

Wednesday, September 20, 2017

.NET Standard: Duplicate 'System.Reflection.AssemblyCompanyAttribute' attribute

In a ‘classic’ .NET project, you have an AssemblyInfo.cs file.


This file contains all kind of information about your assembly

After upgrading a classic .NET project to .NET Standard, I started to get errors about some of the properties inside the AssemblyInfo.cs file:


A .NET Standard project already has the AssemblyInfo information built-in. So after upgrading you end up with 2 AssemblyInfo specifications, leading to the errors above.

The solution is to remove the original AssemblyInfo.cs file in the Properties folder.

Remark: If you want to change the assembly information, you now have to use the Package tab inside your Project Properties.


Tuesday, September 19, 2017

RabbitMQ–Configure access to Management portal

As mentioned in a previous post, it is probably a good idea to enable the RabbitMQ Management plugin to help you track what’s going on inside your service broker.

Now if you try to access the Management plugin from outside the server itself using the default guest account (which you should probably remove), you get a ‘Login failed’ error.


Let’s fix this:

  • Logon to the server
  • Browse to the management portal using the localhost address: http://localhost:15672
  • Logon using the guest account


  • Click on the Admin tab and scroll to the Add a user section
    • Enter a Username and Password
    • Specify one or more Tags as a comma separated list. If you want to give full access, enter ‘administrator’.
    • Click on the Add user button


  • Now click on the newly created user in the user list


  • The set permission section is shown


  • Leave the default settings and click Set permission.
  • That’s it!

Remark: You can do the same steps using the command line tooling. For example, if you want to set the user tags, you can use

$ rabbitmqctl set_user_tags yourName administrator

Monday, September 18, 2017

RabbitMQ–Enable Management plugin

To simplify management and monitoring of your RabbitMQ Service Broker it is a good idea to install the management plugin (don’t expect anything fancy).

  • To install it, logon to the server where you installed RabbitMQ
  • Open a RabbitMQ command prompt


  • Enter the following command
    • rabbitmq-plugins enable rabbitmq_management
  • You’ll get the following log output

D:\Program Files\RabbitMQ Server\rabbitmq_server-3.6.12\sbin>rabbitmq-plugins enable rabbitmq_management

The following plugins have been enabled:

…

Applying plugin configuration to rabbit@SERVER01... started 6 plugins.


  • If you want to access the portal from outside the server, you have to configure a firewall rule that allows TCP traffic on port 15672.


  • Remark: Notice that when you try to access the management portal from outside the server using the default guest account, it will not work. This is a security feature that is enabled by default. To solve that, we’ll create another account, but that’s something we’ll cover in another blog post.

Friday, September 15, 2017

ASP.NET Core–Configuring a WCF service

In an ASP.NET Core application (using the full .NET Framework) we had to consume a WCF service.

Should be easy, right? Unfortunately it turned out to be more work than I expected. In a first post I explained how to generate a client proxy; this post is about setting up the configuration.

WCF configuration can be a daunting beast with a lot of options and things that can go wrong. The generated proxy code hardcodes (some part of) the configuration in the WCF proxy and provides you a partial method to override it, but that’s not the approach we want to take.

I know we’ll host the application in IIS, so adding a web.config and putting the configuration logic over there sounds nice…

Let’s try that:

  • Open the generated proxy reference file and remove the calls to Service1Client.GetDefaultBinding() and Service1Client.GetDefaultEndpointAddress() in the constructor. (Note: this is only for testing purposes)


  • Right click on your ASP.NET Core project and add a web.config file.
  • Right click on the web.config and choose Edit WCF configuration.


  • Go to the Client section and choose Create a New Client…


  • Follow the steps through the Wizard. After completing it you should have something like this inside your web.config:
  • Let’s now try to create a client proxy instance and execute a call:
  • Unfortunately, this didn’t work and we ended up with an exception when we tried to run our application:


  • If that doesn’t work, where should we put this configuration? (And yes, I know I can do everything through code, but that is not what I want here.)
    • An ASP.NET Core project is an executable behind the scenes. The only thing IIS does is forward the request to Kestrel, which runs inside the dotnet process.
    • This executable has its own configuration file that is generated for you out of the box behind the scenes.


  • If you want to change this config, you have to add an app.config instead of a web.config to your project. Let’s just rename the file, rebuild our project and try again…


  • This time it works!
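For reference, the client section the wizard generates ends up looking roughly like this in the renamed app.config (binding, address and contract names come from the wizard and will differ in your project):

```xml
<configuration>
  <system.serviceModel>
    <bindings>
      <basicHttpBinding>
        <binding name="BasicHttpBinding_IService1" />
      </basicHttpBinding>
    </bindings>
    <client>
      <endpoint address="http://localhost/Service1.svc"
                binding="basicHttpBinding"
                bindingConfiguration="BasicHttpBinding_IService1"
                contract="ServiceReference.IService1"
                name="BasicHttpBinding_IService1" />
    </client>
  </system.serviceModel>
</configuration>
```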


Thursday, September 14, 2017

ASP.NET Core–Connecting to a WCF service

In an ASP.NET Core application (using the full .NET Framework) we had to consume a WCF service.

Should be easy, right? Unfortunately it turned out to be more work than I expected.

  • I right-clicked on my project and searched for an Add service reference… option. No luck; instead I saw a Connected Services section. Maybe that will do it?


  • I right-clicked on the Connected Services section and chose Add Connected Service.


  • This opened up the Connected Services window, but no option was available to connect to an existing WCF service.


  • Maybe the Find more services… link at the bottom will help me? This brought me to the Visual Studio Marketplace. And yes… a search for ‘WCF’ turned up a Visual Studio Connected Services extension that allows you to add a WCF web service reference to .NET Core projects. Exactly what I needed.


  • I clicked on Download and closed Visual Studio, after which the installer appeared and I could install the extension.
  • After the installation completed, I opened Visual Studio again and tried Add Connected Service again. This time a third option appeared:
    • Microsoft WCF Web Service Reference Provider – Preview


  • Click on it and you’ll get the same options you had before when using Add service reference…


Wednesday, September 13, 2017


NHibernate–StaleObjectStateException after migrating to DateTime2

I’m currently working at a client where we are (finally) migrating from DB2 to SQL Server. One of the things we encountered is that DB2 uses a different precision (6 digits) for its DateTime than SQL Server, so as part of the migration process we changed all target dates on SQL Server to DateTime2 to not lose any data.

After migrating, everything seemed to work until we tried to save an object through NHibernate to the database: we always got a StaleObjectStateException.

The problem was that we were using one of these DateTime columns for concurrency checks. As NHibernate by default expects a DateTime instead of a DateTime2, we lost some precision when hydrating the objects from the database. When we later tried to persist our changes, the concurrency check saw that the DateTimes were different, resulting in a StaleObjectStateException.

The solution was to change our mapping code to use DateTime2 instead.

Here is our (updated) Fluent NHibernate code:
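A sketch of what that mapping looks like (Order and ModifiedOn are illustrative names; the real entities differ):

```csharp
using FluentNHibernate.Mapping;

public class OrderMap : ClassMap<Order>
{
    public OrderMap()
    {
        Table("Orders");
        Id(x => x.Id);
        // CustomType("datetime2") makes NHibernate keep the full DateTime2
        // precision instead of truncating to the classic datetime precision.
        Version(x => x.ModifiedOn).CustomType("datetime2");
    }
}
```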

And here is a similar example using the NHibernate XML mapping:
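A corresponding sketch with the same illustrative names:

```xml
<class name="Order" table="Orders">
  <id name="Id" />
  <!-- type="datetime2" keeps the extra precision for the concurrency check -->
  <version name="ModifiedOn" type="datetime2" />
</class>
```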

Remark: We also had a problem with the difference in precision between DB2 (6 digits) and SQL Server (7 digits), but that is maybe for another post…

Tuesday, September 12, 2017

Can I retarget my libraries to .NET Standard 2.0?

With the release of .NET Core 2.0 and the .NET Standard 2.0 specification, it’s time to check if I can retarget some of my old libraries to .NET Standard 2.0.

The tool you need is the .NET Portability Analyzer:

After downloading and installing the Visual Studio extension, it is time to configure it first:

  • Open the project you want to analyze in Visual Studio
  • Go to Tools –> Options and click on the .NET Portability Analyzer from the left menu


  • Select your Target Platforms and the Output formats of the generated report and click OK.
  • Now you can right click on a specific project or your solution and choose Analyze Assembly/Project Portability.


  • After the analysis has completed you’ll get a report that contains a nice summary, a long list of details and a list of missing assemblies:




Monday, September 11, 2017

JSON.NET–Using a Custom Contract Resolver without losing CamelCasing

For a project I’m working on we created a custom ContractResolver to apply some localization magic before JSON data is sent to the client.

Here is the code we are using:
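A simplified sketch of it (the actual localization logic is replaced by a placeholder comment):

```csharp
using System;
using System.Collections.Generic;
using Newtonsoft.Json;
using Newtonsoft.Json.Serialization;

public class LocalizedContractResolver : DefaultContractResolver
{
    protected override IList<JsonProperty> CreateProperties(Type type, MemberSerialization memberSerialization)
    {
        var properties = base.CreateProperties(type, memberSerialization);
        foreach (var property in properties)
        {
            // ...localization magic on each property goes here...
        }
        return properties;
    }
}
```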

Let’s try this code:


As you can see the ContractResolver does its job; the only problem is that we lose the CamelCasing. Here is how you can fix it:
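One way (assuming the resolver derives from DefaultContractResolver, as ours does) is to set a NamingStrategy in its constructor:

```csharp
using Newtonsoft.Json.Serialization;

public class LocalizedContractResolver : DefaultContractResolver
{
    public LocalizedContractResolver()
    {
        // DefaultContractResolver leaves property names untouched unless a
        // naming strategy is set, so camel casing has to be opted into again.
        NamingStrategy = new CamelCaseNamingStrategy();
    }
}
```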

Let’s run our code again:


Friday, September 8, 2017

How to start using C# 7.1?

Today I wanted to use C# 7.1 to take advantage of the new async main functionality.

However, I couldn’t immediately find where to activate it.

Let’s walk through the steps:

  • Right click on your project and choose Properties


  • Go to the Build tab and click on the Advanced… button at the bottom


  • Now you can either choose C# 7.1 from the list or select C# latest minor version (latest) to always use the latest version.


  • After clicking OK, you can start using the new C# 7.1 features:
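For example an async Main, the feature that started this search (the body is just an illustration):

```csharp
using System;
using System.Threading.Tasks;

class Program
{
    // Async Main methods are only allowed from C# 7.1 onwards
    static async Task Main(string[] args)
    {
        await Task.Delay(100);
        Console.WriteLine("Hello from an async Main!");
    }
}
```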


Thursday, September 7, 2017

Marten–Soft Deletes

On one of my projects we are using Marten, which provides a Document Store and Event Store API on top of PostgreSQL. Behind the scenes it uses the powerful JSON functionality that is built into the PostgreSQL database engine. The fact that we are still using an ACID-compliant database makes it all a lot easier.


One of the features that Marten supports is ‘soft deletes’. This means that documents are never actually deleted from the database. Marten will automatically filter out documents marked as deleted unless you explicitly state otherwise in the LINQ Where clause.

However, when requesting a specific document I noticed that I still got my deleted document back:

var document = await session.LoadAsync(id);

If I used the query syntax instead the specific document was filtered out as expected:

var documents = await session.Query().Where(s => …);

A GitHub issue brought some insights: this functionality is by design. If you know the id of the document, you’ll get it back; Load will not respect the deleted flag.
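For completeness, soft deletes are opted into per document type when configuring the store; a sketch, where Product is an illustrative document type:

```csharp
using System;
using Marten;

public class Product
{
    public Guid Id { get; set; }
    public string Name { get; set; }
}

public static class StoreSetup
{
    public static DocumentStore Create()
    {
        return DocumentStore.For(options =>
        {
            options.Connection("host=localhost;database=marten;username=postgres");
            // Deleting a Product now only flags it as deleted in the database
            options.Schema.For<Product>().SoftDeleted();
        });
    }
}
```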

Good to know!

Wednesday, September 6, 2017

NPM–Specifying a different registry

I already talked about the NPM registry before. Today I want to share another trick I discovered when looking at the SignalR documentation:

The JavaScript client is being published to our dev npm registry as @aspnet/signalr-client. The module contains a browserfied version of the client. You can install the module as follows:

  • Create an .npmrc file with the following line: @aspnet:registry=
  • Run: npm install @aspnet/signalr-client

I wasn’t aware of the existence of an .npmrc file.

NPM gets its config settings from 3 locations:

  • command line
  • environment variables
  • npmrc files.

The nice thing is that you can create a .npmrc file at multiple levels that will be picked up when executing NPM commands. You can set a .npmrc file

  • per-project config file (/path/to/my/project/.npmrc)
  • per-user config file (~/.npmrc)
  • global config file ($PREFIX/etc/npmrc)
  • npm builtin config file (/path/to/npm/npmrc)

So if we want NPM to use a different registry for a specific project (which is what we want for the SignalR package, as it is not part of the official NPM repo yet), we can create a .npmrc file at the project level and specify a @scope and a related registry. Inside our commands we can then use this scope to point to a specific registry:

    npm install @aspnet/signalr-client
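The project-level .npmrc is then just a one-liner mapping the scope to its feed (the URL below is a placeholder for the actual dev registry):

```
@aspnet:registry=https://example.com/npm/registry/
```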


Tuesday, September 5, 2017

Git: Remote origin already exists

Yesterday, when trying to execute the following command in Git

git remote add origin

I got the following error message

fatal: remote origin already exists.

The problem was that my Git repository was already linked to a remote named “origin”. “origin” is just a naming convention: it is the default name Git gives to the remote you cloned from.

To solve the error you have 2 options:

  • Create a different remote with another name:
  • Remove the existing remote first and retry the original command afterwards:
    • git remote rm origin
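The two options above can be replayed end to end in a throwaway repository (the URLs are placeholders):

```shell
cd "$(mktemp -d)"
git init -q

# Linking the repository to a remote named "origin"
git remote add origin https://example.com/first.git

# Doing it again fails with: fatal: remote origin already exists.
git remote add origin https://example.com/second.git || true

# Option 1: pick another name for the new remote
git remote add upstream https://example.com/second.git

# Option 2: remove the existing remote first, then retry
git remote rm origin
git remote add origin https://example.com/second.git

git remote -v
```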

Monday, September 4, 2017

Upgrading to .NET Standard 2.0

After installing Visual Studio 2017 Update 15.3 and the .NET Core 2.0 SDK I thought I was finally ready to ‘upgrade’ some of my projects to .NET Standard 2.0.

I right-clicked on my project, selected Properties and tried to change the Target Framework on the Application tab. Unfortunately I didn’t see a .NET Standard 2.0 option in the list.

OK, let’s try a different approach. I right-clicked on the project, chose Edit .csproj file and updated the TargetFramework to netstandard2.0 directly in the project file.

This worked, but resulted in a set of strange error messages when I tried to compile a .NET Core project that was referencing this library.

In the end it turned out that a global.json file was the root cause of my problems. The global.json file defines which version of the .NET Core SDK to use:

{
  "sdk": {
    "version": "1.0.0"
  }
}

When you run dotnet new or dotnet build, the dotnet host looks in the current folder, and all parent folders, for a global.json. If a global.json exists (and the SDK version it references exists!) then that version will be used for all SDK commands. If not, the newest version of the SDK is used.

I removed the global.json and not only did my build errors disappear, when I opened the project properties again I could now select .NET Standard 2.0:


Friday, September 1, 2017

Test Impact Analysis is back!

A loooong time ago, in Visual Studio 2010, Microsoft introduced a great new feature: Test Impact Analysis. Test Impact Analysis tries to predict, based on your code changes, which tests should be executed.

In more recent versions of Visual Studio this feature disappeared… until now. Version 2 of the Visual Studio Test task reintroduces the “Run only impacted tests” checkbox. Checking this checkbox will automatically configure the Test Impact data collector to identify and run only the impacted tests.

Remark: If you don’t see this option, check that you are not still using version 1 of the Visual Studio Test task:

More information: