Friday, March 29, 2019

ASP.NET 4.5–System.Web.Http is missing

After upgrading a project to ASP.NET 4.5, a warning appeared next to the System.Web.Http reference. Visual Studio could not find the related DLL anymore.

Starting from Visual Studio 2017, this DLL is no longer available in the GAC but should be added through NuGet.

You can find the correct assembly in the Microsoft.AspNet.WebApi.Core NuGet package.
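
If you use the Package Manager Console, installing it comes down to this (assuming the standard Microsoft.AspNet.WebApi.Core package is the one you need):

Install-Package Microsoft.AspNet.WebApi.Core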

Remark: Don’t be confused by the Core in the name. This package is not meant for ASP.NET Core.

Thursday, March 28, 2019

ASP.NET Core 2.2 - SignalR Breaking changes

After a seemingly flawless upgrade to ASP.NET Core 2.2, we got into trouble when using SignalR hubs.

In our code we were rather lazy and had allowed CORS for everything:
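
Roughly like this (a reconstruction; the policy name "CorsPolicy" is illustrative):

public void ConfigureServices(IServiceCollection services)
{
    services.AddCors(options =>
    {
        options.AddPolicy("CorsPolicy", builder =>
            builder.AllowAnyOrigin()
                   .AllowAnyMethod()
                   .AllowAnyHeader()
                   .AllowCredentials()); // AllowAnyOrigin + AllowCredentials: the problematic combination
    });

    services.AddSignalR();
}

// and in Configure(): app.UseCors("CorsPolicy");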

In ASP.NET Core 2.2 it is no longer allowed to combine AllowAnyOrigin and AllowCredentials. When you try to use the two together, you'll see the following warning in the output window:

warn: Microsoft.AspNetCore.Cors.Infrastructure.CorsService

The CORS protocol does not allow specifying a wildcard (any) origin and credentials at the same time. Configure the policy by listing individual origins if credentials needs to be supported.

The problem is that this breaks the preflight OPTIONS check of SignalR, so the client can no longer connect to the hubs.

To fix this, you have to remove the AllowAnyOrigin method and instead specify the origins explicitly (like it should have been in the first place):
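
With the corrected policy, the relevant part of ConfigureServices becomes something like this (the origins listed are just examples, use your own):

services.AddCors(options =>
{
    options.AddPolicy("CorsPolicy", builder =>
        builder.WithOrigins("https://localhost:4200", "https://app.example.com") // explicit origins instead of AllowAnyOrigin
               .AllowAnyMethod()
               .AllowAnyHeader()
               .AllowCredentials());
});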

Wednesday, March 27, 2019

.NET Core–Unit Tests Configuration

Inside my (.NET Core) unit tests I wanted to load my configuration. I created a small helper method that loads my configuration from a json file:
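
A minimal sketch of such a helper (the class and method names are my own, not necessarily the ones from the original code):

using Microsoft.Extensions.Configuration;

public static class ConfigurationHelper
{
    // Builds an IConfiguration from the appsettings.json found in the given directory.
    public static IConfiguration GetConfiguration(string basePath)
    {
        return new ConfigurationBuilder()
            .SetBasePath(basePath)
            .AddJsonFile("appsettings.json", optional: false)
            .Build();
    }
}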

To make this code compile, you have to add 2 NuGet packages to your test project:
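
Most likely these are the following two (an assumption based on the APIs used in the helper above):

• Microsoft.Extensions.Configuration
• Microsoft.Extensions.Configuration.Json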

Now you can invoke this method inside your test setup (I'm using NUnit, so I use the OneTimeSetUp method) and pass in the test directory (which is available through TestContext.CurrentContext.TestDirectory in NUnit):
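
Something along these lines (a sketch; the field name is illustrative):

[OneTimeSetUp]
public void OneTimeSetUp()
{
    _configuration = ConfigurationHelper.GetConfiguration(TestContext.CurrentContext.TestDirectory);
}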

Remark: Don’t forget to set the ‘Copy to Output Directory’ property of the appsettings.json file to ‘Copy always’.

    Tuesday, March 26, 2019

    Entity Framework Core–Configure Warnings

    Some posts ago I complained about my experience when using EF Core in combination with PostgreSQL. One of the things I didn’t like was that when EF Core couldn’t translate a LINQ query, it silently fell back to client-side evaluation and executed the query on the in-memory collection.

    This turns out not to be specifically related to the EF Core implementation for PostgreSQL, but is in fact general EF Core behavior. I just hadn’t noticed that the same thing happened with SQL Server, as the EF Core provider for SQL Server is a lot smarter and can translate more LINQ queries correctly.

    I said the following in the previous post:

    The only way to discover this is through the logs where you see a warning when the driver couldn’t translate your LINQ statement. If you didn’t check the logs, you aren’t even aware of the issue.

    Yesterday I discovered that you can change this behavior. To do so, I had to call ConfigureWarnings() and set the following option:
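
    A sketch of what that looks like when registering the DbContext (the context name and connection string are illustrative; RelationalEventId lives in Microsoft.EntityFrameworkCore.Diagnostics):

    services.AddDbContext<MyDbContext>(options =>
        options.UseNpgsql(Configuration.GetConnectionString("Default"))
               .ConfigureWarnings(warnings =>
                   warnings.Throw(RelationalEventId.QueryClientEvaluationWarning)));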

    Now, every time the driver would fall back to client-side evaluation of a query, EF Core throws an exception instead.

    Remark: In EF Core 3 the default behavior will change and EF will throw an error any time a LINQ expression results in client side evaluation. You will then have the option to allow those client side evaluations.

    Monday, March 25, 2019

    PostgreSQL performance monitoring using pg_stat_statements

    To monitor the performance of one of our applications I asked to activate the pg_stat_statements module on PostgreSQL.

    From the documentation:

    The pg_stat_statements module provides a means for tracking execution statistics of all SQL statements executed by a server.

    The module must be loaded by adding pg_stat_statements to shared_preload_libraries in postgresql.conf, because it requires additional shared memory. This means that a server restart is needed to add or remove the module.

    So I asked the admins to enable the module and restart the server. Once I got confirmation that the restart was done, I tried to query the pg_stat_statements view:

    SELECT *
    FROM
      pg_stat_statements
    ORDER BY
      total_time DESC;
    Unfortunately this resulted in an error message:
    Query failed: ERROR: relation "pg_stat_statements" does not exist
    Let’s check if the module is indeed installed:

    SELECT * FROM pg_available_extensions

    That seems OK. It turns out that you need to take one extra step before you can use this module. You have to execute the following command:
    CREATE EXTENSION pg_stat_statements; 
    That’s it!

    Friday, March 22, 2019

    Docker–Getting a SQL Server (Express) instance up and running

    To run some local tests, I needed a database. I didn’t want to spend my time installing a local SQL Server (Express) instance.

    So Docker to the rescue! Using SQL Server Express is easy thanks to the mssql-server-linux image.

    To create an instance:

    docker run -e "ACCEPT_EULA=Y" -e "SA_PASSWORD=Passw0rd!123" -e "MSSQL_PID=Express" -p 1433:1433 -d --name=sqlexpress microsoft/mssql-server-linux:latest
    And if you want to keep the data when the container is shut down, you can create an instance using a mounted volume:
    docker run -e "ACCEPT_EULA=Y" -e "SA_PASSWORD=Passw0rd!123" -e "MSSQL_PID=Express" -p 1433:1433 -d --name=sqlexpress -v //d/data/sql:/var/opt/mssql/data microsoft/mssql-server-linux:latest
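
    Once the container is running, you can connect with SQL Server Management Studio on localhost,1433 or open a sqlcmd session inside the container (the sqlcmd path assumes the standard tools location in this image):

    docker exec -it sqlexpress /opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P "Passw0rd!123"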
    Can’t imagine my life anymore without Docker…

    Thursday, March 21, 2019

    Developers, you don’t need to ask permission…

    Reading “The programmer as decision maker” by Mark Seemann hit a nerve with me. You didn’t read it yet? Ok, go do that first….

    …. Done reading? Ok, let’s add my 2 cents. I completely agree with the point that Mark tries to make. Every day in a programmer’s life is a combination of micro and macro decisions. Unfortunately programmers are not always aware that they are making these decisions, which gives them the feeling that they are not in control. This is really sad, especially if you know that the 3 things that motivate us the most are ‘Autonomy’, ‘Mastery’ and ‘Purpose’. We as developers have the luxury that we can quite easily achieve all 3, but we are not aware that these things are in our control…

    I sometimes hear statements like: “Everything is decided for me”, “I’m not involved in the decision making”, “Our PM will never allow us to do X”, where in fact these people are already making decisions (good and bad) all the time.
    So I would say to all developers out there: “Embrace the power that you have and stop thinking you need to ask permission…”



    And of course I have to repeat the Grace Hopper quote:
    "It's easier to ask forgiveness than it is to get permission."
    Grace Hopper
    PS: If anyone from my team is reading this, don’t forget to ask my permission first before you start applying this 😏

    Wednesday, March 20, 2019

    NHibernate–PostgreSQL - Dialect does not support DbType.Guid Parameter name: typecode

    When trying to create a new ISession instance, NHibernate threw the following error message:

    Dialect does not support DbType.Guid Parameter name: typecode

    at NHibernate.Dialect.TypeNames.Get(DbType typecode)
    at NHibernate.Mapping.Table.SqlTemporaryTableCreateString(Dialect dialect, IMapping mapping)
    at NHibernate.Mapping.PersistentClass.PrepareTemporaryTables(IMapping mapping, Dialect dialect)
    at NHibernate.Impl.SessionFactoryImpl..ctor(Configuration cfg, IMapping mapping, Settings settings, EventListeners listeners)
    at NHibernate.Cfg.Configuration.BuildSessionFactory()
    at FluentNHibernate.Cfg.FluentConfiguration.BuildSessionFactory()

    As you can probably guess from the title, we are using PostgreSQL, and it turns out that the default Dialect doesn’t support Guids. Luckily this was fixed a long time ago and the only thing I had to do was explicitly specify a higher Dialect version:
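
    With Fluent NHibernate that boils down to something like the sketch below (PostgreSQL82 and BuildingMap are illustrative; use the dialect version that matches your NHibernate version):

    var sessionFactory = Fluently.Configure()
        .Database(PostgreSQLConfiguration.PostgreSQL82 // a newer dialect version instead of the default
            .ConnectionString(connectionString))
        .Mappings(m => m.FluentMappings.AddFromAssemblyOf<BuildingMap>())
        .BuildSessionFactory();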

    Tuesday, March 19, 2019

    NHibernate–PostgreSQL–Naming strategy

    In a previous post I mentioned we were using PostgreSQL together with Entity Framework Core. One of the things I stumbled over when trying to use NHibernate instead was that Entity Framework Core uses Pascal casing to generate the names of tables, columns and queries, whereas NHibernate uses lowercase names (which is in fact more suitable for PostgreSQL, but that is another discussion).

    One of the reasons I like NHibernate so much is that it is fully customizable and that we can easily change the naming strategy to take the EF Core conventions into account.

    • Start by creating a custom naming strategy (a sketch of both steps follows below).
    • As a second step, register this naming strategy on your NHibernate configuration object (in this example through Fluent NHibernate):
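
    A minimal sketch of both steps, assuming a strategy that keeps the Pascal-cased names and marks them as quoted identifiers (all names are illustrative):

    // Step 1: a naming strategy that keeps the EF Core style Pascal-cased names.
    // The backticks ask NHibernate to quote the identifiers, so PostgreSQL doesn't fold them to lowercase.
    public class PascalCaseNamingStrategy : INamingStrategy
    {
        public string ClassToTableName(string className) => Quote(className);
        public string TableName(string tableName) => Quote(tableName);
        public string ColumnName(string columnName) => Quote(columnName);
        public string PropertyToColumnName(string propertyName) => Quote(propertyName);
        public string PropertyToTableName(string className, string propertyName) => Quote(propertyName);
        public string LogicalColumnName(string columnName, string propertyName) =>
            string.IsNullOrEmpty(columnName) ? propertyName : columnName;

        private static string Quote(string name) => "`" + name + "`";
    }

    // Step 2: register the strategy on the NHibernate configuration (here through Fluent NHibernate).
    var sessionFactory = Fluently.Configure()
        .Database(PostgreSQLConfiguration.PostgreSQL82.ConnectionString(connectionString))
        .Mappings(m => m.FluentMappings.AddFromAssemblyOf<BuildingMap>())
        .ExposeConfiguration(cfg => cfg.SetNamingStrategy(new PascalCaseNamingStrategy()))
        .BuildSessionFactory();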

    Monday, March 18, 2019

    My impressions when using Entity Framework Core with PostgreSQL

    On one of my projects we are using Entity Framework Core. Although my experience combining Entity Framework Core and SQL Server was quite OK, I couldn’t say the same thing when using the Entity Framework Core driver for PostgreSQL (https://www.npgsql.org/efcore/index.html).

    Although you get the impression that everything works, the moment you have a look at the logs you see 2 obvious issues:

    1. For a lot of LINQ statements the driver cannot correctly translate them to a working SQL statement. So what happens is that everything is loaded in memory and the expression is executed on the in-memory collection. The only way to discover this is through the logs, where you see a warning when the driver couldn’t translate your LINQ statement. If you don’t check the logs, you aren’t even aware of the issue.
    2. The generated SQL statements are far from optimal. For example, every time you add an Include, an extra correlated subquery is generated, which turns out to be quite slow on PostgreSQL.

    In our situation I can add another issue to the list, as we were using some GIS features through NetTopologySuite. The problem there was that our (simple?) use case wasn’t supported. In our use case we have a set of buildings stored in our database. Every building has a location and we wanted to return all buildings within a certain bounding box to show them on a map. We couldn’t get this working through the LINQ integration and had to fall back to raw SQL.
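
    Roughly, the raw SQL fallback looked like this (purely a sketch: the entity, table and column names and the SRID are assumptions, not the actual code):

    var buildings = dbContext.Buildings
        .FromSql(
            @"SELECT * FROM ""Buildings""
              WHERE ST_Within(""Location"", ST_MakeEnvelope({0}, {1}, {2}, {3}, 4326))",
            minX, minY, maxX, maxY)
        .ToList();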

    I’m happy to see that the driver is evolving but I thought it would be a good time to give NHibernate a try. (But that is for another blog post).

    PS: I also don’t like that the EF Core driver uses Pascal casing to generate the database objects (tables, columns, …). This is really annoying as PostgreSQL treats those as case-sensitive identifiers and requires that you quote all names that aren’t lowercase…

    Friday, March 15, 2019

    Azure DevOps - Create a prerelease nuget package using versionsuffix

    Creating a prerelease package is not that hard. The .NET Core task allows you to invoke the pack command and gives you the option to use Automatic package versioning:

    This will give you 3 options to generate your package version:

    • 'Use the date and time': this will generate a SemVer-compliant version formatted as X.Y.Z-ci-datetime, where you choose X, Y, and Z.
    • 'Use an environment variable': this allows you to select an environment variable.
    • 'Use the build number': this will use the build number to version your package.

    If none of these options works for you, know that there is another alternative:

    • By adding a VersionSuffix build property you can specify any prerelease value you want.
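
    For example, you can keep a VersionPrefix in the project file and pass the suffix when packing (the values below are illustrative; $(Build.BuildNumber) is the predefined Azure DevOps variable):

    <PropertyGroup>
      <VersionPrefix>1.2.3</VersionPrefix>
    </PropertyGroup>

    dotnet pack --configuration Release --version-suffix ci-$(Build.BuildNumber)

    This would produce a package versioned 1.2.3-ci-<build number>.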

    Thursday, March 14, 2019

    Azure Data Studio–PostgreSQL support

    With the latest release of Azure Data Studio, a new (preview) extension was introduced that supports PostgreSQL.

    Enabling the extension

    • Open up Azure Data Studio.
    • Click on the Extension tab on the left.
    • Click in the Search Extensions in Marketplace textbox and start typing ‘PostgreSQL’. You should find the PostgreSQL extension from Microsoft.

    • Select the extension and click Install.

    Wednesday, March 13, 2019

    .NET Core - The nuget command failed with exit code(1) and error(System.InvalidCastException: Unable to cast object of type 'System.String' to type 'NuGet.Frameworks.NuGetFramework'.

    After creating a .NET Core application, I configured our build pipeline to Restore the NuGet packages, Build the application and publish it. Unfortunately the build failed almost immediately. 

    The build specifically failed on the NuGet pack step with the following error message:

    The nuget command failed with exit code(1) and error(System.InvalidCastException: Unable to cast object of type 'System.String' to type 'NuGet.Frameworks.NuGetFramework'.

    Here are the related build logs:

    2019-03-15T15:58:27.7657780Z Attempting to pack file: D:\b\4\agent\_work\83\s\SOFACore\SOFACore.Logging.Exceptions\SOFACore.Logging.Exceptions.csproj

    2019-03-15T15:58:27.7706605Z [command]D:\b\4\agent\_work\_tool\NuGet\4.1.0\x64\nuget.exe pack D:\b\4\agent\_work\83\s\SOFACore\SOFACore.Logging.Exceptions\SOFACore.Logging.Exceptions.csproj -NonInteractive -OutputDirectory D:\b\4\agent\_work\83\a -Properties Configuration=Release -version 1.0.0-CI-20190315-165827 -Verbosity Detailed

    2019-03-15T15:58:29.7910390Z System.InvalidCastException: Unable to cast object of type 'System.String' to type 'NuGet.Frameworks.NuGetFramework'.

    2019-03-15T15:58:29.7910390Z    at NuGet.ProjectManagement.NuGetProject.GetMetadata[T](String key)

    2019-03-15T15:58:29.7920155Z NuGet Version: 4.1.0.2450

    2019-03-15T15:58:29.7920155Z    at NuGet.ProjectManagement.PackagesConfigNuGetProject..ctor(String folderPath, Dictionary`2 metadata)

    2019-03-15T15:58:29.7920155Z Attempting to build package from ‘SOFACore.Logging.Exceptions.csproj'.

    2019-03-15T15:58:29.7920155Z    at CallSite.Target(Closure , CallSite , Type , Object , Dictionary`2 )

    2019-03-15T15:58:29.7920155Z MSBuild auto-detection: using msbuild version '15.9.21.664' from 'D:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\MSBuild\15.0\bin'. Use option -MSBuildVersion to force nuget to use a specific version of MSBuild.

    2019-03-15T15:58:29.7920155Z    at System.Dynamic.UpdateDelegates.UpdateAndExecute3[T0,T1,T2,TRet](CallSite site, T0 arg0, T1 arg1, T2 arg2)

    2019-03-15T15:58:29.7920155Z    at NuGet.CommandLine.ProjectFactory.AddDependencies(Dictionary`2 packagesAndDependencies)

    2019-03-15T15:58:29.7920155Z    at NuGet.CommandLine.ProjectFactory.ProcessDependencies(PackageBuilder builder)

    2019-03-15T15:58:29.7929920Z    at NuGet.CommandLine.ProjectFactory.CreateBuilder(String basePath, NuGetVersion version, String suffix, Boolean buildIfNeeded, PackageBuilder builder)

    2019-03-15T15:58:29.7929920Z    at NuGet.Commands.PackCommandRunner.BuildFromProjectFile(String path)

    2019-03-15T15:58:29.7929920Z    at NuGet.CommandLine.PackCommand.ExecuteCommand()

    2019-03-15T15:58:29.7929920Z    at NuGet.CommandLine.Command.ExecuteCommandAsync()

    2019-03-15T15:58:29.7929920Z    at NuGet.CommandLine.Command.Execute()

    2019-03-15T15:58:29.7929920Z    at NuGet.CommandLine.Program.MainCore(String workingDirectory, String[] args)

    2019-03-15T15:58:29.7929920Z Packing files from 'D:\b\4\agent\_work\83\s\SOFACore\VLM.SOFACore.Logging.Exceptions\bin\Release\netstandard2.0'.

    2019-03-15T15:58:29.7929920Z Add file 'D:\b\4\agent\_work\83\s\SOFACore\VLM.SOFACore.Logging.Exceptions\bin\Release\netstandard2.0\VLM.SOFACore.Logging.Exceptions.dll' to package as 'lib\netstandard2.0\VLM.SOFACore.Logging.Exceptions.dll'

    2019-03-15T15:58:29.8379110Z ##[error]The nuget command failed with exit code(1) and error(System.InvalidCastException: Unable to cast object of type 'System.String' to type 'NuGet.Frameworks.NuGetFramework'.

       at NuGet.ProjectManagement.NuGetProject.GetMetadata[T](String key)

       at NuGet.ProjectManagement.PackagesConfigNuGetProject..ctor(String folderPath, Dictionary`2 metadata)

       at CallSite.Target(Closure , CallSite , Type , Object , Dictionary`2 )

       at System.Dynamic.UpdateDelegates.UpdateAndExecute3[T0,T1,T2,TRet](CallSite site, T0 arg0, T1 arg1, T2 arg2)

       at NuGet.CommandLine.ProjectFactory.AddDependencies(Dictionary`2 packagesAndDependencies)

       at NuGet.CommandLine.ProjectFactory.ProcessDependencies(PackageBuilder builder)

       at NuGet.CommandLine.ProjectFactory.CreateBuilder(String basePath, NuGetVersion version, String suffix, Boolean buildIfNeeded, PackageBuilder builder)

       at NuGet.Commands.PackCommandRunner.BuildFromProjectFile(String path)

       at NuGet.CommandLine.PackCommand.ExecuteCommand()

       at NuGet.CommandLine.Command.ExecuteCommandAsync()

       at NuGet.CommandLine.Command.Execute()

       at NuGet.CommandLine.Program.MainCore(String workingDirectory, String[] args))

    2019-03-15T15:58:29.8388875Z ##[error]An error ocurred while trying to pack the files.

    2019-03-15T15:58:29.8388875Z ##[section]Async Command Start: Telemetry

    2019-03-15T15:58:29.8388875Z ##[section]Async Command End: Telemetry

    2019-03-15T15:58:29.8398640Z ##[section]Async Command Start: Telemetry

    2019-03-15T15:58:29.8398640Z ##[section]Async Command End: Telemetry

    2019-03-15T15:58:29.8398640Z ##[section]Finishing: NuGet pack

    So what is the problem?

    This seems to be a known problem: https://github.com/NuGet/Home/issues/4808. The NuGet 4.0 pack command cannot handle .NET Core (and .NET Standard) projects.

    There are 2 possible solutions:

    1. Switch to the .NET Core build task and invoke the dotnet pack command.
    2. Use the NuGet Tool Installer build task to install a newer version of nuget.exe (NuGet 4.9.4 seems to work); a sketch of this step follows below.
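
    If you use YAML pipelines, the tool installer step could look roughly like this (the task version and inputs may differ slightly depending on your Azure DevOps version):

    - task: NuGetToolInstaller@0
      displayName: Use NuGet 4.9.4
      inputs:
        versionSpec: 4.9.4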

    Tuesday, March 12, 2019

    Azure DevOps (Server) - Exempt from policy enforcement

    Last week a colleague asked me to explain the concept of Pull Requests using our Azure DevOps Server instance. I followed the steps as described here to activate a branch policy on their master branch. Then I made a local change, committed it and tried to push it to the master branch, but instead of getting an error stating that I couldn’t push to this branch, it succeeded! I tried it a few times and every time I was able to push, no matter what policies I activated.

    Was it a bug? Or did I do something wrong? Turned out it was neither.

    Let’s find out what happened:

    • Go to the Code –> Branches section inside Azure DevOps (TFS).

    • Click on the 3 dots … and choose Branch Security.

    • On the Security screen, I clicked on my user account and had a look at the permissions on the right.

    • I had the Exempt from policy enforcement permission enabled. This overrules all policies and explains why I was able to push my changes.

    Some extra notes from the documentation:

    There are several permissions that allow users to bypass branch policy. In TFS 2015 through TFS 2018 Update 2, the Exempt from policy enforcement permission allows users with this permission to perform the following actions:

    • When completing a pull request, opt-in to override policies and complete a pull request even if the current set of branch policies is not satisfied.
    • Push directly to a branch even if that branch has branch policies set. Note that when a user with this permission makes a push that would override branch policy, the push automatically bypasses branch policy with no opt-in step or warning.

    In Azure DevOps Services, the Exempt from policy enforcement permission is removed and its functionality divided into the following two new permissions:

    • Bypass policies when completing pull requests
    • Bypass policies when pushing

    Users that previously had Exempt from policy enforcement enabled now have the two new permissions enabled instead.

    Friday, March 8, 2019

    ASP.NET Core Performant logging

    One of the hidden features of ASP.NET Core is the support for LoggerMessage.

    From the documentation:

    LoggerMessage features create cacheable delegates that require fewer object allocations and reduced computational overhead compared to logger extension methods, such as LogInformation, LogDebug, and LogError. For high-performance logging scenarios, use the LoggerMessage pattern.

    LoggerMessage provides the following performance advantages over Logger extension methods:

    • Logger extension methods require "boxing" (converting) value types, such as int, into object. The LoggerMessage pattern avoids boxing by using static Action fields and extension methods with strongly-typed parameters.
    • Logger extension methods must parse the message template (named format string) every time a log message is written. LoggerMessage only requires parsing a template once when the message is defined.

    The best way to use it is through some extension methods on the ILogger interface (a sketch of the full pattern follows after the steps below):

    • Create a static LoggerExtensions class
    • Add a static constructor and use the Define(LogLevel, EventId, String) method to create an Action delegate for logging a message. For the Action, specify:
      • The log level.
      • A unique event identifier (EventId) with the name of the static extension method.
      • The message template (named format string).
    • Store the created Action delegate in a private static field
    • Create an extension method that invokes the delegate
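
    Putting these steps together gives something like this (the event id, message template and method name are illustrative):

    public static class LoggerExtensions
    {
        // The cached delegate: the message template is parsed only once, here.
        private static readonly Action<ILogger, string, Exception> _orderProcessed;

        static LoggerExtensions()
        {
            _orderProcessed = LoggerMessage.Define<string>(
                LogLevel.Information,
                new EventId(1001, nameof(OrderProcessed)),
                "Order {OrderNumber} processed");
        }

        // The extension method that invokes the cached delegate.
        public static void OrderProcessed(this ILogger logger, string orderNumber)
        {
            _orderProcessed(logger, orderNumber, null);
        }
    }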

    Thursday, March 7, 2019

    Angular–Global Error Handling

    By default, Angular comes with its own ErrorHandler. This error handler captures all errors inside our Angular app and prevents the app from crashing.

    If you want to implement your own error handling logic, this would be a good place to start. Therefore we have to create a new class that inherits from ErrorHandler:
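
    A sketch of such a handler (GlobalErrorHandler, ErrorService and addError are illustrative names):

    import { ErrorHandler, Injectable, Injector } from '@angular/core';
    import { ErrorService } from './error.service';

    @Injectable()
    export class GlobalErrorHandler extends ErrorHandler {
      // The Injector is used because the ErrorHandler is created before most other services.
      constructor(private injector: Injector) {
        super();
      }

      handleError(error: any): void {
        const errorService = this.injector.get(ErrorService);
        errorService.addError(error);
        super.handleError(error); // keep the default console logging
      }
    }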

    Inside this ErrorHandler we use our own ErrorService:

    This ErrorService exposes an observable that can be used to subscribe to errors in different places.
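
    A minimal sketch of that service (again, the naming is my own):

    import { Injectable } from '@angular/core';
    import { Observable, Subject } from 'rxjs';

    @Injectable({ providedIn: 'root' })
    export class ErrorService {
      private errorSubject = new Subject<any>();

      // Other parts of the app can subscribe to this stream to react to errors.
      readonly errors$: Observable<any> = this.errorSubject.asObservable();

      addError(error: any): void {
        this.errorSubject.next(error);
      }
    }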

    The ErrorHandler is registered inside our AppModule:
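
    The registration boils down to overriding the default ErrorHandler provider (sketch):

    import { NgModule, ErrorHandler } from '@angular/core';
    import { BrowserModule } from '@angular/platform-browser';

    @NgModule({
      declarations: [AppComponent],
      imports: [BrowserModule],
      providers: [
        { provide: ErrorHandler, useClass: GlobalErrorHandler }
      ],
      bootstrap: [AppComponent]
    })
    export class AppModule { }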

    Wednesday, March 6, 2019

    Serilog–Filter expressions

    One of the lesser known features inside Serilog is the support for filter expressions. This gives you a SQL-like syntax to filter your log messages.

    To enable this feature you have to install the Serilog.Filters.Expressions nuget package.
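
    Once the package is installed, a filter expression can be used directly in the logger configuration, for example (the expression and the Console sink are illustrative; the latter needs the Serilog.Sinks.Console package):

    var logger = new LoggerConfiguration()
        .Filter.ByExcluding("RequestPath like '/health%'")
        .WriteTo.Console()
        .CreateLogger();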

    More information: https://nblumhardt.com/2017/01/serilog-filtering-dsl/

    Tuesday, March 5, 2019

    Serilog–Sub loggers

    One of the lesser known features inside Serilog is the support for sub loggers. This allows you to redirect (part of) the log traffic to different sinks based on certain conditions.

    To create and use a sub logger you have to use the WriteTo.Logger() method. On this method you can create a whole new logger element with its own enrichers, filters and sinks:
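
    A sketch that matches the description below (Matching comes from the Serilog.Filters namespace; the file sinks assume the Serilog.Sinks.File package):

    Log.Logger = new LoggerConfiguration()
        .MinimumLevel.Debug()
        .WriteTo.Logger(lc => lc
            .MinimumLevel.Warning()
            .WriteTo.File("warning.txt"))
        .WriteTo.Logger(lc => lc
            .Filter.ByIncludingOnly(Matching.FromSource("Microsoft"))
            .WriteTo.File("microsoft.txt"))
        .CreateLogger();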

    In this example all log data with warning level or higher (including data coming from Microsoft) is written to warning.txt and all Microsoft-related data is logged to microsoft.txt.

    More information here: https://nblumhardt.com/2016/07/serilog-2-write-to-logger/

    Monday, March 4, 2019

    Serilog–Code Analyzer

    With the introduction of Roslyn as the compiler platform in Visual Studio, we got support for Roslyn analyzers. If you’ve never heard of them, read this great introduction: https://andrewlock.net/creating-a-roslyn-analyzer-in-visual-studio-2017/.

    Although creating your own analyzer is not that easy, using them is. And there are a lot of situations where an analyzer can prevent you from making some stupid mistakes.

    One example I liked was when using Serilog. Serilog is a structured logging framework where your messages are logged using message templates. Parameters used inside these message templates are serialized and stored as separate properties on the log event giving you great flexibility in searching and filtering through log data.

    Here is an example from the Serilog website:
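
    (Reproduced here; this is the well-known getting-started snippet from the Serilog homepage.)

    var position = new { Latitude = 25, Longitude = 134 };
    var elapsedMs = 34;
    Log.Information("Processed {@Position} in {Elapsed:000} ms.", position, elapsedMs);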

    The Position and the Elapsed properties are stored separately from the message.

    The problem is that you can easily make a mistake: although the message template expects 2 parameters, it is possible to provide only one. This is a perfect situation where a Roslyn analyzer can help.

    And the good news is that such an analyzer already exists: https://github.com/Suchiman/SerilogAnalyzer. This analyzer checks for the situation above and a lot of other small mistakes you can make when using the Serilog library.

    Friday, March 1, 2019

    SQL Server–Row vs Page compression

    Row compression is one of those performance tricks that can have a big impact on your SQL Server database. I got a question from a colleague about the difference with page compression. So here is the (short) answer:

    Row compression is a subset of page compression. This means that when page compression is enabled row compression is active as well.

    So what is page compression adding to the table (no pun intended)?

    Page compression applies the following techniques, in this order:

    • Row compression
    • Prefix compression
    • Dictionary compression
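
    To try it on a table, you can first estimate the gain and then rebuild the table with compression enabled (the table name below is illustrative):

    -- Estimate the savings for page compression:
    EXEC sp_estimate_data_compression_savings 'dbo', 'Orders', NULL, NULL, 'PAGE';

    -- Enable page compression (which includes row compression):
    ALTER TABLE dbo.Orders REBUILD WITH (DATA_COMPRESSION = PAGE);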

    The difference between those is well explained in the SQL Server documentation: https://docs.microsoft.com/en-us/previous-versions/sql/sql-server-2012/cc280464(v=sql.110)

    If you are in doubt if you should enable page compression or not, the following article will give you some guidance: https://docs.microsoft.com/en-us/previous-versions/sql/sql-server-2008/dd894051(v=sql.100)