Friday, January 31, 2020

Visual Studio 2019 - Container tools

After reading Scott Hanselman’s blog post about the container tools in Visual Studio, I was eager to try this out.

But I couldn’t find the containers window at the location mentioned in the post:

View Menu –> Other Windows –> Containers

Turns out that is a rather new feature that requires Visual Studio 2019 version 16.4 (or higher). After installing the update the menu item should appear:

Now you can start viewing containers, browsing available images, checking logs and port mappings, launching a terminal window, and so on, all directly from within Visual Studio.

Nice!

Thursday, January 30, 2020

Azure DevOps Server–Delete the default wiki

While creating a new Azure DevOps Server project, I accidentally created a Project Wiki. Unfortunately there is no option yet to delete this wiki.

The trick is to use the REST API to delete the Git repo that is used behind the scenes:
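
The call boils down to a DELETE on the Git repository that backs the wiki. A sketch with curl (server, collection, project and wiki repo names are placeholders, and the api-version may differ per server release):

```shell
COLLECTION="http://tfs-server:8080/tfs/DefaultCollection"
PROJECT="MyProject"
WIKI_REPO="MyProject.wiki"   # the hidden Git repo behind the project wiki

URL="${COLLECTION}/${PROJECT}/_apis/git/repositories/${WIKI_REPO}?api-version=5.0"
echo "$URL"

# Uncomment to actually delete the repo (requires a PAT in $PAT):
# curl -X DELETE -u ":${PAT}" "$URL"
```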

That’s it!

Remark: To be able to call this through a tool like Postman, you need to create a PAT and include it in the Authorization header (Postman will do the Base64 encoding for you) or use the NTLM support in Postman.

Authorization: Basic PAT
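
If you build the header by hand instead, the value is the Base64 encoding of `:PAT` (empty username, PAT as password). For example, with a placeholder PAT:

```shell
PAT="mysecretpat"   # placeholder, use your real PAT
AUTH="Authorization: Basic $(printf ':%s' "$PAT" | base64)"
echo "$AUTH"
```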

Wednesday, January 29, 2020

NuGet PackageReference–PrivateAssets

Visual Studio 2017 and .NET Core introduced the new csproj format, where your NuGet packages are no longer referenced through a packages.config file but are added to your csproj file directly:
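
For example (package name and version are just an illustration):

```xml
<ItemGroup>
  <!-- A NuGet dependency referenced straight from the project file -->
  <PackageReference Include="Newtonsoft.Json" Version="12.0.3" />
</ItemGroup>
```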

If you want to create a NuGet package from your project, you no longer need a separate configuration file but your csproj file can contain all the information about your NuGet package. It will automatically include all PackageReferences that you are using, so gone is all the copy/paste work.

But what if you have a dependency that is purely used during development and that you don’t want to expose to projects that will consume your package? In this scenario, you can use the PrivateAssets metadata to control this behavior.
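
For example, a build-time-only dependency can be marked so that it doesn’t flow to consumers of your package (package name and version are illustrative):

```xml
<ItemGroup>
  <!-- Used during build only; not exposed as a dependency of the produced package -->
  <PackageReference Include="Microsoft.SourceLink.GitHub" Version="1.0.0" PrivateAssets="all" />
</ItemGroup>
```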

There are multiple possible values that you can find in the documentation: https://docs.microsoft.com/en-us/nuget/consume-packages/package-references-in-project-files#controlling-dependency-assets

Monday, January 27, 2020

Azure DevOps - Scan your repo for vulnerabilities

Almost every (web) application built today is using one or more open-source components. But are you sure that none of the packages that you are using has vulnerabilities?

NPM already helps you by running an NPM audit when you install new packages, but wouldn’t it be nice to go one step further and have everything nicely integrated into your build pipeline?

Enter WhiteSource Bolt for Azure DevOps. It is a FREE extension, which scans all your projects and detects open source components, their license and known vulnerabilities.

It supports the most common programming languages and does continuous tracking of multiple open source vulnerabilities databases like the NVD, security advisories, peer-reviewed vulnerability databases, and popular open source projects issue trackers.

The only limitation in the free version is that you can scan any project up to 5 times a day.

To install it go to the Azure DevOps Marketplace and click on the ‘Get it free’ button (or go to the Marketplace through your local Azure DevOps Server instance).

After installing it you see a new tab in the Pipelines section. The first time you open this item, you’ll have to provide some information.

As a last step you need to add the WhiteSource Bolt build task to one or more of your builds:
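
In a YAML pipeline this boils down to a single task. Treat the snippet below as a sketch: the exact task identifier and version come from the installed extension and may differ on your server.

```yaml
steps:
- task: WhiteSource Bolt@20   # identifier/version depend on the installed extension
  displayName: 'Scan dependencies with WhiteSource Bolt'
```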

Friday, January 24, 2020

Open vs Closed layers

A lot of applications, even today, are built using the layered architecture pattern. Inside a layered architecture, an application is organized into a set of horizontal layers where each layer performs a specific role.

A typical example is a 4-layer architecture with a presentation, business, persistence and database layer:

In a layered architecture each layer has its own responsibilities, which gives you a clean separation of concerns. Each layer can only talk to the layer directly below it (see the direction of the arrows). As a consequence your presentation layer cannot talk to the database layer directly but has to pass through all intermediate layers to finally reach the database.

By default all layers are closed. A closed layer means that as a request moves from layer to layer, it MUST go through the layer right below it to get to the layer below that one.

What some people seem to forget is that inside a layered architecture it is also possible to mark layers as open. There can be good reasons why in some situations a certain layer can be skipped. A good example is a CQRS architecture: on the write side you have to go through the business and persistence layers to get to the database, while on the read side it is OK to skip the business layer and talk to the persistence layer directly:
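
A minimal C# sketch of that idea (all type names are made up for illustration): the write path is forced through the business layer, while the read path is allowed to hit persistence directly.

```csharp
// Hypothetical types, for illustration only.
public record PlaceOrderCommand(int ProductId, int Quantity);
public record OrderDto(int Id, string Status);

public interface IOrderService { void PlaceOrder(PlaceOrderCommand cmd); }  // business layer
public interface IOrderReadRepository { OrderDto GetById(int id); }         // persistence layer

public class OrderController
{
    private readonly IOrderService _service;
    private readonly IOrderReadRepository _reads;

    public OrderController(IOrderService service, IOrderReadRepository reads)
        => (_service, _reads) = (service, reads);

    // Write side: the business layer is closed, every request passes through it.
    public void PlaceOrder(PlaceOrderCommand cmd) => _service.PlaceOrder(cmd);

    // Read side: the business layer is open here, so we go straight to persistence.
    public OrderDto GetOrder(int id) => _reads.GetById(id);
}
```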

Thursday, January 23, 2020

ASP.NET Web API Lifecycle–Step by Step

While investigating an issue in one of our (older) applications I was wondering about the exact steps taken to handle a specific request.

So far I’ve always used the following poster:

But thanks to Matthew Jones it became even easier. He walks through every step in much more detail so that you know exactly what is going on. Thanks Matthew!

https://exceptionnotfound.net/the-asp-net-web-api-2-http-message-lifecycle-in-43-easy-steps-2/

Wednesday, January 22, 2020

Mocking the HttpClient with NSubstitute

While reviewing some code I noticed that a project was using both Moq and NSubstitute. It seemed strange to me to use 2 mocking libraries in the same codebase.

It turned out that there was a problem when trying to combine IHttpClientFactory and NSubstitute. The reason is that IHttpClientFactory returns a concrete type (HttpClient), and NSubstitute cannot mock its non-virtual methods (like SendAsync()).

The trick was to create your own HttpMessageHandler instead:
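
A typical hand-rolled handler looks something like this (the class name is just an example):

```csharp
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

// A test double that always returns a canned response.
// HttpMessageHandler.SendAsync is protected, so we expose the behavior by overriding it.
public class StubHttpMessageHandler : HttpMessageHandler
{
    private readonly HttpResponseMessage _response;

    public StubHttpMessageHandler(HttpResponseMessage response) => _response = response;

    protected override Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
        => Task.FromResult(_response);
}
```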

The mocking code became the following:
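
Assuming a stub handler along the lines just described (here called StubHttpMessageHandler, an illustrative name), the setup could look like:

```csharp
using System;
using System.Net;
using System.Net.Http;
using NSubstitute;

// The factory itself is an interface, so NSubstitute can mock it;
// the HttpClient it hands out is a real one wired to our stub handler.
var handler = new StubHttpMessageHandler(
    new HttpResponseMessage(HttpStatusCode.OK)
    {
        Content = new StringContent("{\"status\":\"ok\"}")
    });

var factory = Substitute.For<IHttpClientFactory>();
factory.CreateClient(Arg.Any<string>())
       .Returns(new HttpClient(handler) { BaseAddress = new Uri("https://example.test/") });
```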

Tuesday, January 21, 2020

Qualities of a Highly Effective Architect

A must see for every (wannabe) architect:

Monday, January 20, 2020

GitHub - Code navigation

While browsing through a GitHub repository I noticed the following announcement:

At the moment this is available for the following languages:

  • Go
  • JavaScript (beta)
  • PHP (beta)
  • Python
  • Ruby
  • TypeScript (beta)

Clicking on a method shows you the references and allows you to navigate to them:

Nice!

More information: https://help.github.com/en/github/managing-files-in-a-repository/navigating-code-on-github

Friday, January 17, 2020

Azure DevOps Server - Portable PDB distribution

As I mentioned yesterday, the story of sharing Portable PDBs using Azure DevOps Server isn’t complete yet. Azure Artifacts on premises doesn’t support Portable PDBs (yet) and uploading .snupkg files doesn’t work either. I don’t want to switch back to Full PDBs.

This means that at the moment of writing the only feasible solution would be to still include the PDB in the main NuGet package. Although this significantly increases the size of the package (and thus the restore time for projects that consume the package), I don’t see a better alternative.

To include the Portable PDB in the NuGet package, add the following setting to your csproj file:

  <PropertyGroup>
    <AllowedOutputExtensionsInPackageBuildOutputFolder>$(AllowedOutputExtensionsInPackageBuildOutputFolder);.pdb</AllowedOutputExtensionsInPackageBuildOutputFolder>
  </PropertyGroup>  

Thursday, January 16, 2020

Azure DevOps Server 2019–Portable PDB

I’m quite confused about the current state of Portable PDBs in combination with Azure DevOps Server.

If you look around on the Internet, the recommendation from Microsoft is to use SourceLink support. So all my projects reference the Microsoft.SourceLink.AzureDevOpsServer.Git NuGet package and have the following settings in the project file:

  <PublishRepositoryUrl>true</PublishRepositoryUrl>
  <EmbedUntrackedSources>true</EmbedUntrackedSources>

I also added a reference to my Azure DevOps Server repo:

  <ItemGroup>
    <SourceLinkAzureDevOpsServerGitHost Include="server-name" VirtualDirectory="tfs"/>
  </ItemGroup>

More information: https://github.com/dotnet/sourcelink

Inside the documentation they mention the following:

  Including PDBs in the .nupkg is generally no longer recommended as it increases the size of the package and thus restore time for projects that consume your package, regardless of whether the user needs to debug through the source code of your library or not.

OK, so it is not a good idea to include the PDB in the NuGet package. So let’s publish to a file share instead, as it is not possible yet to publish symbols to your local Azure Artifacts instance (as part of your Azure DevOps Server).

I added the Index Sources and Publish Symbols task to my build pipeline. If you leave the Index sources checkbox checked, the task will output the following:

  Skipping: D:\b\5\_work\50\s\ExampleApp\bin\Release\netcoreapp2.2\Example.pdb because it is a Portable PDB

This does make sense, as the indexing is already done by SourceLink. But when I take a look at the publishing part, no files are stored 😒

  2020-01-14T08:58:54.0211016Z ##[debug]SYMSTORE: Number of files stored = 0
  2020-01-14T08:58:54.0230550Z ##[debug]SYMSTORE: Number of errors = 0
  2020-01-14T08:58:54.0250084Z ##[debug]SYMSTORE: Number of files ignored = 123

It seems that the task cannot handle Portable PDBs yet.

I see 2 possible workarounds:

  1. Switching back to Full PDB files; as we are not running on Linux (yet) this would be a good short-term solution.
  2. Keeping the Portable PDB files, but including them in the NuGet packages.

We decided to go for workaround #2.

Wednesday, January 15, 2020

Rapid frontend development with GraphQL

As frontend development gets more complex, I see more and more organizations moving to a model where the frontend and backend are implemented by different teams (another nice example of reverse Conway’s law).

Most of these backends are exposed through REST APIs. This leads to the problem that frontend teams using these APIs have to wait for the backend team to finish developing them. The backend team becomes the bottleneck and slows down development for the frontend team.

Let’s bring GraphQL into the mix…

In the GraphQL world, there is a different approach. The frontend and backend teams can develop in parallel, without stalling the development. The frontend teams can work with mock versions of the API, and use libraries like GraphQL Faker to create fake data. Coding can be completely done with mock data and tests can be written too.

I know that in theory a similar approach would be feasible with REST APIs, but these APIs tend to be more static by nature. One GraphQL endpoint can handle a multitude of clients, each with their own requirements. It is really hard to have the same flexibility with a REST API; most of them are either too generic (which leads to overfetching) or too specific (which leads to underfetching).

More information: https://www.howtographql.com/basics/1-graphql-is-the-better-rest/

Tuesday, January 14, 2020

Asynchronous streams - Using IAsyncEnumerable in .NET 4.7

Although IAsyncEnumerable is part of the C# 8 release, and C# 8.0 is supported on .NET Core 3.x and .NET Standard 2.1, this doesn’t mean that you cannot use this feature in .NET Core 2.x or the full .NET Framework.

We’ll start with the following (failing to compile) code in a .NET 4.7 project and try to make it work:
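
For illustration, this is the kind of asynchronous-stream code that a C# 7.3 / .NET 4.7 project refuses to compile out of the box (names are made up):

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

public static class Program
{
    // Requires IAsyncEnumerable<T> and asynchronous streams (C# 8).
    public static async IAsyncEnumerable<int> GetNumbersAsync()
    {
        for (var i = 1; i <= 3; i++)
        {
            await Task.Delay(10); // simulate async work
            yield return i;
        }
    }

    public static async Task Main()
    {
        await foreach (var number in GetNumbersAsync())
            Console.WriteLine(number);
    }
}
```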

A .NET Standard 2.0 project doesn’t know the IAsyncEnumerable interface. So the first thing we need to do is to install the compatibility NuGet package Microsoft.Bcl.AsyncInterfaces.

Now the compiler finds the IAsyncEnumerable interface, but Visual Studio still complains because we are targeting C# 7.3 and asynchronous streams are a C# 8 feature.

We can fix this, but it requires that the following things are installed on our computer:

  • .NET Core SDK 3.0 or MSBuild Tools 2019
  • Visual Studio 2019 or VSCode

If these requirements are met, we can update our project file and tell the compiler to use C# 8 as the language version:

  • Unload the project.
  • Add a <LangVersion>8</LangVersion> property to the project file.
  • Reload the project.
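
In the csproj file that boils down to something like:

```xml
<PropertyGroup>
  <!-- Opt in to C# 8 so asynchronous streams compile on this target framework -->
  <LangVersion>8</LangVersion>
</PropertyGroup>
```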

Monday, January 13, 2020

ElasticSearch–Pinned queries

One of the nice features of ElasticSearch is the support for Pinned Queries. With Pinned Queries you can promote a set of documents to rank higher than those matching a given query. This is a nice feature if you want to put specific results into the spotlight (for example if you want to promote a specific product or article).

In the organic part you specify the query you want to execute. The ids part lists the documents that will be returned first and put on top of the search results.
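
Roughly, a pinned query looks like this (the ids and the match query are illustrative):

```json
{
  "query": {
    "pinned": {
      "ids": ["1", "4", "100"],
      "organic": {
        "match": { "description": "brown shoes" }
      }
    }
  }
}
```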

More info: https://www.elastic.co/guide/en/elasticsearch/reference/7.4/query-dsl-pinned-query.html

Friday, January 10, 2020

MassTransit 6–Serilog integration

Before MassTransit 6, separate NuGet packages existed that allowed you to integrate the logging framework of your choice with MassTransit. In our case we were using Serilog and a MassTransit.SerilogIntegration NuGet package to bring the 2 together.

In MassTransit 6 this abstraction has been removed and replaced by Microsoft.Extensions.Logging.Abstractions.

To enable integration you need to call

MassTransit.Context.LogContext.ConfigureCurrentLogContext(loggerFactory);

before configuring the bus or directly pass on the ILogger instance

MassTransit.Context.LogContext.ConfigureCurrentLogContext(logger);

Integration with Serilog can now be done through the serilog-extensions-logging NuGet package.
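
Putting it together, the wiring could look something like this (a sketch; it assumes the Serilog.Extensions.Logging package, which provides the AddSerilog extension):

```csharp
using MassTransit.Context;
using Microsoft.Extensions.Logging;
using Serilog;

// Build a Serilog logger and expose it through Microsoft.Extensions.Logging.
Log.Logger = new LoggerConfiguration()
    .WriteTo.Console()
    .CreateLogger();

// AddSerilog comes from the Serilog.Extensions.Logging package.
var loggerFactory = new LoggerFactory().AddSerilog();

// Must happen before the bus is configured.
LogContext.ConfigureCurrentLogContext(loggerFactory);
```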

More information: https://www.nuget.org/packages/MassTransit.SerilogIntegration/

Thursday, January 9, 2020

People believe in what they can control, not what works

Food for thought:

I especially liked the part about the Dunning-Kruger effect and its impact on our industry.

Wednesday, January 8, 2020

Computer Vision repo

Computer vision is fun but can also be extremely hard to get started with. To help you, Microsoft created a GitHub repo where they share a lot of examples and utilities:

https://github.com/microsoft/ComputerVision-recipes

This repository provides examples and best practice guidelines for building computer vision systems. The goal of this repository is to build a comprehensive set of tools and examples that leverage recent advances in Computer Vision algorithms, neural architectures, and operationalizing such systems. Rather than creating implementations from scratch, we draw from existing state-of-the-art libraries and build additional utility around loading image data, optimizing and evaluating models, and scaling up to the cloud. In addition, having worked in this space for many years, we aim to answer common questions, point out frequently observed pitfalls, and show how to use the cloud for training and deployment.

Tuesday, January 7, 2020

ASP.NET Core–Disable Model validation when using [ApiController]

ASP.NET Core 2.1 introduced the [ApiController] attribute, which automatically enables some API features like model validation, HTTP 400 responses, etc.

When a controller is decorated with the [ApiController] attribute, the framework automatically registers a ModelStateInvalidFilter which runs on the OnActionExecuting event.

This is nice, but I had a situation where I wanted to use my own validation and didn’t want to integrate with the model binding. However, I still wanted to take advantage of all the other features that [ApiController] adds.

This is possible through the ApiBehaviorOptions:
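
A sketch of what that could look like in Startup.ConfigureServices (assuming ASP.NET Core 2.1):

```csharp
public void ConfigureServices(IServiceCollection services)
{
    services.AddMvc()
        .SetCompatibilityVersion(CompatibilityVersion.Version_2_1);

    services.Configure<ApiBehaviorOptions>(options =>
    {
        // Keep the other [ApiController] conveniences,
        // but skip the automatic HTTP 400 response for an invalid ModelState.
        options.SuppressModelStateInvalidFilter = true;
    });
}
```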

Monday, January 6, 2020

Azure DevOps–Migrate from TFVC to Git

There is a nice feature in Azure DevOps I wasn’t aware existed.

You can directly migrate from Team Foundation Version Control (TFVC) to Git through Azure DevOps. The import experience is great for small projects. For bigger projects it is recommended to first remove all binaries and executables from the source repo.

Here are the steps to use this feature:

  • Browse to your Azure DevOps (Server) environment
  • Select the project where you want to create the Git repository
  • Go to Repos –> Files

  • From the repo dropdown at the top, select the Import repository option.

  • On the Import repository dialog, select TFVC as the Source type.
  • Now you can specify the name of the repository / branch / folder that you want to import.
  • You can also decide if you want to migrate some of the history. You can migrate up to 180 days of history starting from the most recent changeset.

More information: https://docs.microsoft.com/nl-nl/azure/devops/repos/git/import-from-TFVC?view=azure-devops

Thanks Jef for the tip!

Friday, January 3, 2020

IIS Web Deploy issue - Microsoft.Web.Delegation.DeploymentAuthorizationException

When trying to deploy a web application through Web Deploy, it failed with the following exception message:

Microsoft.Web.Delegation.DeploymentAuthorizationException: Not able to log on the user '.\WDeployConfigWriter'

The strange thing was that it had worked before. So what had changed?

The problem is that the Web Deploy installer creates users with expiring passwords, which are used to elevate permissions during deployment.

To fix it we had to change the password settings for the WDeployAdmin and WDeployConfigWriter accounts:

  • Go to Computer Management -> Local Users and Groups -> Users
  • Right-click on WDeployAdmin and choose Properties
  • Uncheck “User must change password at next logon”
  • Check “Password never expires”
  • Click OK.

Do the same for WDeployConfigWriter.
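
If you prefer to script the “Password never expires” part, something like this should work on Windows PowerShell 5.1+ (a sketch; it assumes the Microsoft.PowerShell.LocalAccounts module is available on the machine):

```powershell
# Mark both Web Deploy accounts so their passwords never expire.
Set-LocalUser -Name 'WDeployAdmin' -PasswordNeverExpires $true
Set-LocalUser -Name 'WDeployConfigWriter' -PasswordNeverExpires $true
```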