
Showing posts from December, 2018

ASP.NET Error - The specified task executable "csc.exe" could not be run.

When trying to build a colleague's ASP.NET Web application, it failed with the following error message: The specified task executable "csc.exe" could not be run. Could not load file or assembly 'System.Security.Principal.Windows, Version=4.0.1.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a' or one of its dependencies. I was able to fix the problem by updating the Microsoft.CodeDom.Providers.DotNetCompilerPlatform NuGet package to the latest version: <package id="Microsoft.CodeDom.Providers.DotNetCompilerPlatform" version="2.0.1" targetFramework="net471" />

Adding GraphQL middleware when using ASP.NET Core

Enabling a GraphQL endpoint in your ASP.NET Core application is quite easy thanks to the GraphQL.Server.Transports.AspNetCore NuGet package. One of the nice features of GraphQL is that you can extend your resolvers using custom middleware. The GraphQL.NET documentation refers to the FieldsMiddleware property to register this extra middleware. Unfortunately, when using the GraphQL.Server package you only have a GraphQLOptions object, and the only thing you can do there is set a SetFieldMiddleware flag. To register your own middleware you have to jump through some extra hoops: create a custom GraphQLExecutor and override the GetOptions method, then register this executor instance in your IoC container (here I'm using StructureMap):
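The shape of such a custom executor can be sketched as follows. This is a hedged sketch, not the post's original code: the class name InstrumentedGraphQLExecutor is my own, and the exact constructor and GetOptions signatures come from my reading of GraphQL.Server 2.x and may differ in your package version.

```csharp
// Hedged sketch: override GetOptions to attach field middleware to the
// ExecutionOptions the server will use. Check signatures against your
// GraphQL.Server version before relying on this.
public class InstrumentedGraphQLExecutor<TSchema> : DefaultGraphQLExecuter<TSchema>
    where TSchema : ISchema
{
    public InstrumentedGraphQLExecutor(TSchema schema, IDocumentExecuter executer,
        IOptions<GraphQLOptions> options, IEnumerable<IDocumentExecutionListener> listeners,
        IEnumerable<IValidationRule> validationRules)
        : base(schema, executer, options, listeners, validationRules) { }

    protected override ExecutionOptions GetOptions(string operationName, string query,
        Inputs variables, IDictionary<string, object> context,
        CancellationToken cancellationToken)
    {
        var options = base.GetOptions(operationName, query, variables, context,
            cancellationToken);
        // Register the custom field middleware on the resolved options.
        options.FieldMiddleware.Use(next => async fieldContext =>
        {
            // ...custom logic before/after the resolver runs...
            return await next(fieldContext);
        });
        return options;
    }
}
```

The executor is then registered in the IoC container in place of the default one.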

ASP.NET Web API–The inner handler has not been assigned

I created the following message handler to add a correlation id to an outgoing HTTP request, and used the following code to link it to the HttpClient. This looked OK to me until I tried to execute it. It failed horribly with the following error message:   "message": "An error has occurred.",   "exceptionMessage": "The inner handler has not been assigned.",   "exceptionType": "System.InvalidOperationException",   "stackTrace": "   at System.Net.Http.DelegatingHandler.SetOperationStarted()    at System.Net.Http.DelegatingHandler.SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)    at CorrelationId.AddCorrelationIdToRequestHandler.SendAsync(HttpRequestMessage request, CancellationToken cancellationToken) What was not obvious to me is that you have to specify the next handler in the request pipeline. As I only had one handler added, I had to specify the HttpCl
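The fix can be sketched as below. This is a minimal reconstruction, not the post's original code: the header name "X-Correlation-Id" and the handler body are illustrative; the essential point is assigning InnerHandler before the client is used.

```csharp
// Hedged sketch: a correlation-id DelegatingHandler. Without an InnerHandler,
// DelegatingHandler.SendAsync throws InvalidOperationException.
public class AddCorrelationIdToRequestHandler : DelegatingHandler
{
    protected override Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        // Illustrative header name; use whatever your services agree on.
        request.Headers.Add("X-Correlation-Id", Guid.NewGuid().ToString());
        return base.SendAsync(request, cancellationToken);
    }
}

// Usage: chain the handler to the real transport handler.
var client = new HttpClient(new AddCorrelationIdToRequestHandler
{
    InnerHandler = new HttpClientHandler() // the missing piece
});
```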

Angular–Output enum as string in your component html

TypeScript supports two types of enums: numeric enums and string enums. From the documentation: Numeric enums: an enum can be defined using the enum keyword. enum Direction { Up = 1, Down, Left, Right, } Above, we have a numeric enum where Up is initialized with 1. All of the following members are auto-incremented from that point on. In other words, Direction.Up has the value 1, Down has 2, Left has 3, and Right has 4. Numeric enums can be mixed in computed and constant members (see below). String enums: string enums are a similar concept, but have some subtle runtime differences as documented below. In a string enum, each member has to be constant-initialized with a string literal, or with another string enum member. enum Direction { Up = "UP", Down = "DOWN", Left = "LEFT", Right = "RIGHT", } While string enums don't have auto-incrementing behavior,
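Tying this back to the post's title: a string enum member already is its string value at runtime, so a component only needs to expose the enum on a public field for the template to render it. A minimal sketch (the component and field names are illustrative, not from the original post):

```typescript
// A string enum: each member's runtime value is the string literal itself.
enum Direction {
  Up = "UP",
  Down = "DOWN",
  Left = "LEFT",
  Right = "RIGHT",
}

class CompassComponent {
  // Re-expose the enum so an Angular template could reference Direction.Up etc.
  Direction = Direction;
  current: Direction = Direction.Up;
}

// Template usage (sketch): <span>{{ current }}</span> renders the string "UP".
const compass = new CompassComponent();
console.log(compass.current); // "UP"
```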

Azure DevOps Server - [System.String[]] doesn't contain a method named 'Trim'

When trying to release an application using Azure DevOps Pipelines, one of the tasks failed with the following error message: 2018-12-18T14:02:03.6775235Z Deployment status for machine DEVELOPMENT : Failed 2018-12-18T14:02:03.6941240Z Deployment failed on machine DEVELOPMENT with following message : System.Exception: Method invocation failed because [System.String[]] doesn't contain a method named 'Trim'. 2018-12-18T14:02:03.7087715Z ##[error]] doesn't contain a method named 'Trim'."}};] 2018-12-18T14:02:03.9362960Z ##[error]Deployment on one or more machines failed. System.Exception: Method invocation failed because [System.String[]] doesn't contain a method named 'Trim'. This specific task deploys the application by remotely executing PowerShell on the target machine. The problem was that an outdated PowerShell version was still installed on that machine. I checked the Powershell version using the f
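The version check the excerpt alludes to can be done with PowerShell's built-in $PSVersionTable automatic variable (a standard PowerShell feature, shown here as a sketch since the post's own snippet was cut off):

```powershell
# Print the PowerShell version installed on the target machine;
# remote-deployment tasks typically need a reasonably recent version.
$PSVersionTable.PSVersion
```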

"NHibernate.NonUniqueObjectException: a different object with the same identifier value was already associated with the session"

I’m a big fan of NHibernate (if I didn’t mention this before). The only problem is that its error messages can be kind of cryptic, especially if you have limited knowledge of NHibernate. A colleague came to me with the following problem: he tried to update a detached object that he got back from an API call and had constructed himself. NHibernate refused to save it (using the Update method) and returned the following error message: "NHibernate.NonUniqueObjectException: a different object with the same identifier value was already associated with the session" The problem was that a similar object (with the same identifier) was already loaded and tracked by the session. When he tried to associate the detached object with the session, NHibernate recognized the tracked object with the same identifier and raised the error above. To fix this, you have to switch from Update() to Merge(). This method copies the state of the given object onto the persistent object with the same
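The switch can be sketched as follows. This is a hedged illustration (the sessionFactory and detachedProduct names are my own, not from the post); Update and Merge are standard NHibernate ISession methods:

```csharp
// Hedged sketch: Update() throws NonUniqueObjectException when the session
// already tracks an instance with the same identifier; Merge() copies the
// detached state onto the tracked instance instead.
using (var session = sessionFactory.OpenSession())
using (var transaction = session.BeginTransaction())
{
    // 'detachedProduct' was reconstructed from an API call, not loaded here.
    // session.Update(detachedProduct); // would throw NonUniqueObjectException
    var persistent = session.Merge(detachedProduct); // returns the tracked object
    transaction.Commit();
}
```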

Free ebook - Pattern Recognition and Machine Learning

I’m investing a lot of time to get at least a little bit of understanding of machine learning. Luckily there is a lot of (free) information out there. Christopher Bishop, Technical Fellow and Laboratory Director at Microsoft Research Cambridge, shared his book Pattern Recognition and Machine Learning . This leading textbook provides a comprehensive introduction to the fields of pattern recognition and machine learning. It is aimed at advanced undergraduates and first-year PhD students, as well as researchers and practitioners. No previous knowledge of pattern recognition or machine learning concepts is assumed (exactly what I need). Much (machine) learning fun!

Feature ‘default literal’ is not available in C# 7.0.

On one of my projects we are using ‘default literals’, one of the features introduced in C# 7.1. To be able to use this, you have to change your project properties to point to the latest minor version (more information here: https://bartwullems.blogspot.com/2017/09/how-to-start-using-c-71.html ). Unfortunately, on the build server it didn’t work as expected; we got the following error in our build logs: error CS8107: Feature ‘default literal’ is not available in C# 7.0. Please use language version 7.1 or greater. The strange thing was that on the same build server, another build using the same feature did succeed. So the problem couldn’t be that C# 7.1 was not installed on the build server. Then I had an idea: maybe it was related to the build configuration. And indeed, when I switched from Debug to Release the problem appeared in my Visual Studio as well. When enabling C# 7.1 for a project, a <LangVersion> element is introduced in the csproj file:
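The csproj fragment the excerpt refers to looks roughly like the sketch below. This matches the failure pattern described (Debug builds fine, Release fails): Visual Studio writes LangVersion only into the PropertyGroup of the configuration that was active when you changed the setting. The Condition values are the common defaults and may differ in your project:

```xml
<!-- Sketch: LangVersion added per build configuration. If only the Debug group
     has it, Release builds still compile as C# 7.0. Fix: set it in both groups
     (or once, in an unconditional PropertyGroup). -->
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|AnyCPU' ">
  <LangVersion>7.1</LangVersion>
</PropertyGroup>
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
  <LangVersion>7.1</LangVersion>
</PropertyGroup>
```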

Entity Framework–PostgreSQL–Enable spatial type support–Part 2

In a previous post I explained how to enable spatial type support for your Entity Framework (Core) model. If you added the HasPostgresExtension("postgis") check, you get the following error message when you try to execute the Entity Framework migration and the extension is not installed: could not open extension control file "/usr/share/postgresql/9.5/extension/postgis.control": No such file or directory To install the extension you can follow the instructions on the PostGIS website: https://postgis.net/install/ . The installation differs based on the host platform you are using for your PostgreSQL instance. After the installation has completed you can activate the extension for your database using the following statement: -- Enable PostGIS (includes raster) CREATE EXTENSION postgis;

Entity Framework–PostgreSQL–Enable spatial type support

To enable spatial type support for PostgreSQL you have to do an extra step when configuring your DbContext. Calling the UseNetTopologySuite() method activates a plugin for the Npgsql EF Core provider which enables mapping NetTopologySuite's types to PostGIS columns, and even translates many useful spatial operations to SQL. This is the recommended way to interact with spatial types in Npgsql. To check if the PostGIS extension is installed in your database, you can add the following to your DbContext:
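The two steps above can be sketched as follows. The UseNetTopologySuite and HasPostgresExtension calls are real Npgsql EF Core provider APIs; the connection string and method bodies are illustrative placeholders, not the post's original code:

```csharp
// Hedged sketch: enable NetTopologySuite mapping and declare the PostGIS
// extension requirement on the model.
protected override void OnConfiguring(DbContextOptionsBuilder options)
    => options.UseNpgsql(
        "Host=localhost;Database=mydb;Username=me;Password=secret", // illustrative
        o => o.UseNetTopologySuite()); // map NetTopologySuite types to PostGIS

protected override void OnModelCreating(ModelBuilder modelBuilder)
    => modelBuilder.HasPostgresExtension("postgis"); // fails fast if PostGIS is missing
```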

SQL Server–Document your database using Extended Properties

On one of my projects we had to re-use an existing table. Unfortunately the table wasn’t documented very well (read: not at all). To make it even worse, some of the columns were used for different things than you would expect based on the column names. The table even contained some implicit logic about the way the data was inserted into it. To avoid this horror again, we decided to apply the boy scout rule and leave the campground in a better state than we found it. We started by adding documentation to changed or newly added columns using the SQL Server Extended Properties feature. You can add extended properties either through SQL Server Management Studio or through the sp_addextendedproperty stored procedure: exec sp_addextendedproperty @name = N'Price' ,@value = N'Testing entry for Extended Property' ,@level0type = N'Schema', @level0name = 'northwind' ,@level1type = N'Table', @level1name = 'Product' ,@level2type

Enabling row compression in SQL Server

Enabling row and page compression can give you a big performance gain in SQL Server. IO remains expensive, especially when your SQL Server is still using spinning disks. By enabling row (and page) compression you can greatly decrease the amount of storage needed on disk. How to enable row compression? We’ll start by estimating the space savings for row compression by executing the following stored procedure: EXEC sp_estimate_data_compression_savings 'vervoer', 'MAD', NULL, NULL, 'ROW' ; GO Here are the results we get back: By dividing the column size_with_requested_compression_setting(KB) by the column size_with_current_compression_setting(KB), we saw we could save over 50%. Sounds good enough for me, let’s enable this: ALTER TABLE vervoer.MAD REBUILD PARTITION = ALL WITH (DATA_COMPRESSION = ROW) GO

NUnit–TestCase vs TestCaseSource

NUnit supports parameterized tests through the TestCase attribute. This allows you to specify multiple sets of arguments and will create multiple tests behind the scenes. However, the kind of information that you can pass through an attribute is rather limited. If you want to pass complex objects you need a different solution. In that case (no pun intended) you can use the TestCaseSource attribute: TestCaseSourceAttribute is used on a parameterized test method to identify the source from which the required arguments will be provided. The attribute additionally identifies the method as a test method. The data is kept separate from the test itself and may be used by multiple test methods. An example:
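A side-by-side sketch of both attributes (the Order type and test names are illustrative, not from the original post; the attribute usage follows standard NUnit):

```csharp
// TestCase: simple inline arguments, one generated test per attribute.
[TestCase(2, 3, 5)]
[TestCase(0, 0, 0)]
public void Add_ReturnsSum(int a, int b, int expected)
{
    Assert.AreEqual(expected, a + b);
}

// TestCaseSource: arguments come from a method, so complex objects are possible.
private static IEnumerable<TestCaseData> OrderCases()
{
    yield return new TestCaseData(new Order { Quantity = 2, UnitPrice = 10m }, 20m);
    yield return new TestCaseData(new Order { Quantity = 0, UnitPrice = 10m }, 0m);
}

[TestCaseSource(nameof(OrderCases))]
public void Total_IsQuantityTimesUnitPrice(Order order, decimal expected)
{
    Assert.AreEqual(expected, order.Quantity * order.UnitPrice);
}
```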

Azure DevOps–Where can I see the available hosted pipeline minutes?

Last week I got a question from one of our internal teams because they got the following error message when trying to execute a build: Your account has no free minutes remaining. Add a hosted pipeline to run more builds or releases. Until recently we were using private build agents, but we decided to switch completely to hosted builds. Checking the remaining build minutes Here are the steps to check the remaining build minutes: Open the Azure DevOps site of your organisation Click on Organization settings in the left corner Click on Retention and parallel jobs Switch to the Parallel jobs tab Here you can see the available job pipelines and the number of minutes remaining (if you are using the free tier) To purchase parallel jobs, you can click on the available link. You will be redirected to the Visual Studio marketplace: https://marketplace.visualstudio.com/items?itemName=ms.build-release-hosted-pipelines Clicking on the Get

FluentValidation–Conditional Validation Rule

FluentValidation is really powerful, but this power sometimes makes it complex to find the correct way to solve a specific problem. In this case I wanted to conditionally execute a certain validation only when data in another field was filled in. FluentValidation makes this possible through the When/Unless methods. Here is a short example: we only validate the DeliveryDate when FastDelivery is chosen.
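Such a conditional rule can be sketched as below. The When method is standard FluentValidation; the Order type and property names are illustrative stand-ins for the post's lost example:

```csharp
// Hedged sketch: the DeliveryDate rule only runs when FastDelivery was chosen.
public class OrderValidator : AbstractValidator<Order>
{
    public OrderValidator()
    {
        RuleFor(o => o.DeliveryDate)
            .NotNull()
            .When(o => o.FastDelivery); // condition guards the whole rule chain
    }
}
```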

Learning F# through FSharpKoans

In my journey to become a better F# (and C#) developer, I found the following project on GitHub: https://github.com/ChrisMarinos/FSharpKoans From the documentation: Inspired by EdgeCase's fantastic Ruby koans, the goal of the F# koans is to teach you F# through testing. When you first run the koans, you'll be presented with a runtime error and a stack trace indicating where the error occurred. Your goal is to make the error go away. As you fix each error, you should learn something about the F# language and functional programming in general. Your journey towards F# enlightenment starts in the AboutAsserts.fs file. These koans will be very simple, so don't overthink them! As you progress through more koans, more and more F# syntax will be introduced which will allow you to solve more complicated problems and use more advanced techniques. To get started, clone the project and open it in Visual Studio (Code). Browse to the FSharpKoans project and run ‘dotnet wa

ASP.NET Core SignalR–Add a Redis backplane

The moment you start using SignalR, sooner rather than later you should add a backplane to allow scaling out your backend services. At the moment of writing I’m aware of only two possible backplanes: the Azure SignalR Service, in which case you move your full SignalR backend logic to Azure and Azure manages the scale-out for you; and a Redis backplane, in which case you still run the SignalR backend yourself and only the data is replicated through Redis. In their official documentation Microsoft refers to the Microsoft.AspNetCore.SignalR.StackExchangeRedis NuGet package, but I’m using the (official?) Microsoft.AspNetCore.SignalR.Redis package. Here are the steps you need to take to use it: Add the Microsoft.AspNetCore.SignalR.Redis package to your ASP.NET Core project. In your startup class add the AddRedis line to your SignalR middleware configuration: Add the Redis connection string to your configuration. That’s all! Remark: Don’t for
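The AddRedis step can be sketched as follows. AddRedis is the extension method provided by the Microsoft.AspNetCore.SignalR.Redis package; the configuration key "Redis" is an illustrative assumption:

```csharp
// Hedged sketch: wire the Redis backplane into the SignalR registration
// (ASP.NET Core 2.x-era Startup class).
public void ConfigureServices(IServiceCollection services)
{
    services.AddSignalR()
            .AddRedis(Configuration.GetConnectionString("Redis")); // backplane
}
```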

ASP.NET Web API–Mediatype exception on POST

When trying to send some JSON data through an HTTP POST, I got the following exception message back: The request entity's media type 'text/plain' is not supported for this resource. No MediaTypeFormatter is available to read an object of type 'TransportDocument' from content with media type 'text/plain' This error showed up because I forgot to specify the content type in the HTTP headers. Here is one way to fix this:
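One way to set the content type explicitly can be sketched as below (the URL and the transportDocument variable are illustrative; StringContent's third argument is what becomes the Content-Type header):

```csharp
// Hedged sketch: serialize manually and declare the media type on the content.
var json = JsonConvert.SerializeObject(transportDocument);
var content = new StringContent(json, Encoding.UTF8, "application/json");
var response = await client.PostAsync("api/transportdocuments", content);
```

Alternatively, HttpClient's PostAsJsonAsync extension method sets the application/json content type for you.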