
Posts

Showing posts from February, 2021

NUnit–SetUp and TearDown methods

NUnit allows you to bootstrap individual tests through the SetUp and TearDown attributes. I typically try to keep my test classes clean, and one way to do this is by moving logic to a test base class. NUnit will walk through the inheritance tree and call all SetUp and TearDown methods. SetUp methods (both types) are called on base classes first, then on derived classes. If any SetUp method throws an exception, no further SetUps are called. Something I noticed in the documentation: TearDown methods (again, both types) are called on derived classes first, then on the base class. The TearDown methods at any level in the inheritance hierarchy will be called only if a SetUp method at the same level was called. The following example illustrates the difference. So if you have a test structure like below, only the base SetUp and TearDown are called. This is a breaking change from NUnit 2.x and something to be aware of…
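A minimal sketch of such a hierarchy (class and method names are my own, chosen to make the call order visible):

```csharp
using NUnit.Framework;

public class TestBase
{
    [SetUp]
    public void BaseSetUp()
    {
        // Called FIRST, before any derived SetUp.
    }

    [TearDown]
    public void BaseTearDown()
    {
        // Called LAST, after the derived TearDown —
        // but only if BaseSetUp actually ran.
    }
}

[TestFixture]
public class DerivedTests : TestBase
{
    [SetUp]
    public void DerivedSetUp()
    {
        // Called after BaseSetUp. If BaseSetUp throws,
        // this method is never invoked.
    }

    [TearDown]
    public void DerivedTearDown()
    {
        // Called before BaseTearDown.
    }

    [Test]
    public void SomeTest() => Assert.Pass();
}
```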

MassTransit–How to test consumers with dependencies

I’m using Dependency Injection in my MassTransit consumers. To test these consumers you can use the MassTransit in-memory test harness. But when you try to use the code above with a consumer that has dependencies injected through the constructor, it will not work and you end up with the following error message: 'SubmitOrderConsumer' must be a non-abstract type with a public parameterless constructor in order to use it as parameter 'T' in the generic type or method 'ConsumerTestHarnessExtensions.Consumer&lt;T&gt;(BusTestHarness, string)' To fix this, you have to change your test code and use the consumer factory method:
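A sketch of what that could look like with the MassTransit v7-era in-memory harness. `SubmitOrderConsumer`, `SubmitOrder` and `FakeOrderService` are assumptions based on the error message, not code from the post:

```csharp
var harness = new InMemoryTestHarness();

// Instead of harness.Consumer<SubmitOrderConsumer>() — which requires a
// public parameterless constructor — pass a factory that supplies the
// dependencies yourself:
var consumerHarness = harness.Consumer(
    () => new SubmitOrderConsumer(new FakeOrderService()));

await harness.Start();
try
{
    await harness.InputQueueSendEndpoint.Send(
        new SubmitOrder { OrderId = Guid.NewGuid() });

    // Verify the consumer actually received the message.
    Assert.True(await consumerHarness.Consumed.Any<SubmitOrder>());
}
finally
{
    await harness.Stop();
}
```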

Elastic App Search - JSON must be an array or hash

I’m a big fan of ElasticSearch. One of the products in the Elastic suite is App Search. I tried to get some documents through the App Search API using the following code: Unfortunately this didn’t work but resulted in the following error message: JSON must be an array or hash The reason I got this exception is that the API expects a JSON array or object for the id parameter. Unstructured JSON data is not permitted. You have to encapsulate your object in an array [] or a hash {} (which is what the error message is referring to). If you are using query parameters, ensure that ids is followed by %5B%5D - this is the array notation [], but escaped: ?ids%5B%5D=
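For illustration, a request shape that satisfies the API (the engine name and document ids here are hypothetical; the point is that `ids` must be a JSON array, not a bare string):

```
GET /api/as/v1/engines/my-engine/documents

{ "ids": ["doc-1", "doc-2"] }
```

The equivalent query-string form would be `?ids%5B%5D=doc-1&ids%5B%5D=doc-2`, i.e. the escaped version of `?ids[]=doc-1&ids[]=doc-2`.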

NuGet - Change the location of the global package cache

One of the improvements introduced to NuGet a few years ago was the global package cache. Instead of keeping a copy of every NuGet package inside your project folder, all packages are downloaded and stored once in a central location on your file system. Each package is fully expanded into a subfolder that matches the package identifier and version number. By default these packages are stored at the following locations:

Windows: %userprofile%\.nuget\packages
Mac/Linux: ~/.nuget/packages

You can override this by setting the NUGET_PACKAGES environment variable, the globalPackagesFolder or repositoryPath configuration setting (when using PackageReference or packages.config, respectively), or the RestorePackagesPath MSBuild property (MSBuild only). Remark: The environment variable takes precedence over the configuration settings.
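A sketch of the configuration-file variant, using a hypothetical target folder; this goes in a nuget.config next to your solution (or in the user-level config):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <config>
    <!-- Move the global package cache to D:\NuGetCache (example path) -->
    <add key="globalPackagesFolder" value="D:\NuGetCache" />
  </config>
</configuration>
```

The environment-variable alternative on Windows would be something like `setx NUGET_PACKAGES D:\NuGetCache`, and it wins over the setting above if both are present.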

Dapr–Ebook

With the v1 release of Dapr , it is time to investigate it and evaluate how it can help simplify your cloud-native application development. A good introduction is the following free e-book: Dapr for .NET Developers

Postman - Uploading multiple files through a multipart/form-data POST

I had to test an ASP.NET Core API that was expecting multiple files through a multipart/form-data request. Here are the steps I had to take to get this done through Postman:

1. Open Postman and click on the '+' sign to create a new request.
2. Change the request type to 'POST' and enter the request URL.
3. Click on the 'Body' section and change the type to 'form-data'.
4. Enter a key name for the files you want to upload (in my case it was 'files').
5. Now comes the important part: go to the right border of the Key column. A dropdown appears where you can change the type from 'Text' to 'File'.
6. The Value column now changes and you get a 'Select Files' button that allows you to specify the files you want to upload.

That should do the trick...
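For context, the receiving side of such a request might look like this in ASP.NET Core: the `files` form-data key binds to a collection of IFormFile. The route and the `_uploadPath` field are assumptions for the sketch:

```csharp
[HttpPost("upload")]
public async Task<IActionResult> Upload(List<IFormFile> files)
{
    // Each entry posted under the 'files' key ends up in this collection.
    foreach (var file in files)
    {
        var target = Path.Combine(_uploadPath, Path.GetFileName(file.FileName));
        using var stream = System.IO.File.Create(target);
        await file.CopyToAsync(stream);
    }
    return Ok(new { Count = files.Count });
}
```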

.NET Reflection - The invoked member is not supported in a dynamic assembly.

In the bootstrapping logic of my application I was using some reflection to dynamically register dependencies in the IoC container: When I was using the same code together with the .NET Core Test server, it didn't work but resulted in the following exception:

Message: System.NotSupportedException : The invoked member is not supported in a dynamic assembly.

Stack Trace:
InternalAssemblyBuilder.GetExportedTypes()
Assembly.get_ExportedTypes()
&lt;&gt;c.&lt;ScanForModules&gt;b__5_0(Assembly a)
SelectManySingleSelectorIterator`2.MoveNext()
List`1.InsertRange(Int32 index, IEnumerable`1 collection)
FrameworkBuilder.ScanForModules()
&lt;&gt;c__DisplayClass0_0.&lt;UseSOFACore&gt;b__1(HostBuilderContext ctx, ContainerBuilder builder)
ConfigureContainerAdapter`1.ConfigureContainer(HostBuilderContext hostContext, Object containerBuilder)
HostBuilder.CreateServiceProvider()
HostBuilder.Build()
WebApplicationFactory`1.CreateHost(IH
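One possible fix is to skip dynamic assemblies (such as the InternalAssemblyBuilder the test server generates) before calling GetExportedTypes(). This is a sketch; the `IModule` marker interface is a hypothetical stand-in for whatever type the scanning logic looks for:

```csharp
// Assembly.IsDynamic filters out runtime-emitted assemblies, which
// throw NotSupportedException from GetExportedTypes().
var moduleTypes = AppDomain.CurrentDomain.GetAssemblies()
    .Where(a => !a.IsDynamic)
    .SelectMany(a => a.ExportedTypes)
    .Where(t => typeof(IModule).IsAssignableFrom(t) && !t.IsAbstract)
    .ToList();
```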

Azure DevOps - My Pull Requests

I was always annoyed that I had to go through all my Azure DevOps projects to find out what pull requests I had to review. Before, this wasn't such a big issue as I was typically working on a few Azure DevOps projects at most. But since we switched to a microservices approach where every service has its own Git repo, it started to annoy me... Until a colleague sent me a screenshot where he had all pull requests on one screen! If you go to the root URL of your Azure DevOps organisation (https://dev.azure.com/{yourorganizationname}), you have a 'My Pull Requests' tab at your disposal. This feature has existed for a long time but I never noticed it...😏

System.UnauthorizedAccessException - MemoryStream's internal buffer cannot be accessed

When trying to read a MemoryStream I got the following error message: Here is the code I was using: It couldn't be any simpler, but unfortunately it didn't work. I couldn't figure out what I did wrong...until I noticed the following in the documentation: This constructor does not expose the underlying stream. GetBuffer throws UnauthorizedAccessException. Aha, it seems that the OpenXmlPowerToolsDocument is calling GetBuffer behind the scenes. So we need to use a different constructor:
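A sketch of the fix, assuming the stream was originally created from a plain byte array: use the overload that makes the buffer publicly visible, so GetBuffer() no longer throws.

```csharp
byte[] buffer = File.ReadAllBytes(path);

// new MemoryStream(buffer) hides the underlying buffer;
// this overload exposes it, so GetBuffer() succeeds:
var stream = new MemoryStream(buffer, index: 0, count: buffer.Length,
                              writable: true, publiclyVisible: true);
```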

ASP.NET Core Integration Tests - Breaking change in .NET Core 3.0

Yesterday I blogged about how to get started with integration testing in ASP.NET Core. If you are using this technique, you might run into a few issues while migrating from .NET Core 2.2 to .NET Core 3.0 or 3.1. It could be that you get the following error message: No method 'public static IHostBuilder CreateHostBuilder(string[] args)' or 'public static IWebHostBuilder CreateWebHostBuilder(string[] args)' found on 'AutoGeneratedProgram'. Alternatively, WebApplicationFactory`1 can be extended and 'CreateHostBuilder' or 'CreateWebHostBuilder' can be overridden to provide your own instance. This is due to the switch from the WebHostBuilder to the more generic HostBuilder in .NET Core 3.x. You should use the CreateHostBuilder() method. It is still possible to access the WebHostBuilder through the ConfigureWebHostDefaults() method:
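The Program.cs shape the factory looks for in .NET Core 3.x is the standard generic-host template (shown here with the default `Startup` class as an assumption):

```csharp
public class Program
{
    public static void Main(string[] args) =>
        CreateHostBuilder(args).Build().Run();

    // WebApplicationFactory discovers this method by convention.
    public static IHostBuilder CreateHostBuilder(string[] args) =>
        Host.CreateDefaultBuilder(args)
            .ConfigureWebHostDefaults(webBuilder =>
            {
                // The WebHostBuilder is still reachable here.
                webBuilder.UseStartup<Startup>();
            });
}
```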

ASP.NET Core Integration testing

Integration testing your ASP.NET Core APIs shouldn't be too difficult thanks to the Microsoft.AspNetCore.Mvc.Testing package. First add the package to your project: dotnet add package Microsoft.AspNetCore.Mvc.Testing Now you can inject the WebApplicationFactory&lt;TEntryPoint&gt; in your tests: You can tweak the configuration by inheriting from the WebApplicationFactory and overriding one of its methods:
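A minimal sketch of such a test with xUnit; the `Startup` entry point and the `/api/values` endpoint are assumptions:

```csharp
public class ApiTests : IClassFixture<WebApplicationFactory<Startup>>
{
    private readonly HttpClient _client;

    // xUnit injects the shared factory through the class fixture.
    public ApiTests(WebApplicationFactory<Startup> factory)
    {
        _client = factory.CreateClient(); // in-memory test server
    }

    [Fact]
    public async Task Get_ReturnsSuccess()
    {
        var response = await _client.GetAsync("/api/values");
        response.EnsureSuccessStatusCode();
    }
}
```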

Saying goodbye to the Monolith - The one Entity abstraction

In most systems I encounter there is a strong focus on "Entities". We try to model everything in this "Entity" abstraction where one "Entity" is typically mapped to one table. Every piece of data that is related to this "Entity" is encapsulated in this object. This could sound fine from an OO perspective but can bring us on a path where we get too much coupling and information from multiple domains gets encapsulated in the same object. A first step in the right direction is to focus on the difference between related entities and child entities. Could some parts of the data move to a child entity or even a value object? A second step is to identify your bounded contexts and start to split your entity across the boundaries of these contexts. This means you will typically not end up with one entity but many entities, each containing a subset of the data. Let's take a look at an example... Imagine that we have a Product entity. If we try to split
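A sketch of the idea, with property names chosen purely for illustration: one "god" Product entity versus a Product per bounded context, each owning only its own slice of the data.

```csharp
// Before: one entity mixing concerns from several domains.
public class Product
{
    public string Sku { get; set; }
    public string Description { get; set; }
    public decimal Price { get; set; }
    public int StockLevel { get; set; }
}

// After: each bounded context keeps its own Product,
// linked by a shared identifier (the SKU here).
namespace Catalog
{
    public class Product
    {
        public string Sku { get; set; }
        public string Description { get; set; }
    }
}

namespace Pricing
{
    public class Product
    {
        public string Sku { get; set; }
        public decimal Price { get; set; }
    }
}

namespace Inventory
{
    public class Product
    {
        public string Sku { get; set; }
        public int StockLevel { get; set; }
    }
}
```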

GraphQL Hot Chocolate - Entity Framework error: A second operation was started on this context before a previous operation completed. This is usually caused by different threads concurrently using the same instance of DbContext

If you try to combine the Hot Chocolate GraphQL implementation with Entity Framework, a possible attempt could look like this: Unfortunately this doesn't work the moment multiple resolvers kick in. When you try to resolve multiple queries/fields, chances are you end up with the following exception:

"message": "A second operation was started on this context before a previous operation completed. This is usually caused by different threads concurrently using the same instance of DbContext. For more information on how to avoid threading issues with DbContext, see https://go.microsoft.com/fwlink/?linkid=2097913 .",
"stackTrace": "
at Microsoft.EntityFrameworkCore.Internal.ConcurrencyDetector.EnterCriticalSection()
at Microsoft.EntityFrameworkCore.Query.Internal.SingleQueryingEnumerable`1.AsyncEnumerator.MoveNextAsync()
at Microsoft.EntityFrameworkCore.Query.ShapedQueryCompilingExpressionVisitor.SingleOrDefaultAsync[
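One possible fix (a sketch, not necessarily what the post goes on to show): with EF Core 5+ you can register a DbContext factory so that every resolver creates its own short-lived context instead of sharing a single scoped instance. `AppDbContext`, `Order` and the connection string are assumptions:

```csharp
// Startup: register a factory instead of a shared scoped DbContext.
services.AddDbContextFactory<AppDbContext>(options =>
    options.UseSqlServer(Configuration.GetConnectionString("Default")));

// Resolver: each invocation gets its own DbContext, so parallel
// field resolution no longer races on one instance.
public class Query
{
    public async Task<List<Order>> GetOrders(
        [Service] IDbContextFactory<AppDbContext> factory)
    {
        await using var db = factory.CreateDbContext();
        return await db.Orders.ToListAsync();
    }
}
```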

ASP.NET Core - Async action names

I got into trouble when I created the following controller action in my ASP.NET Core application: This didn't work as expected but returned a 500 error stating: No route matches the supplied values. The strange thing was that when I removed the 'Async' suffix from the action name, it worked. This turned out to be a breaking change in ASP.NET Core 3.0. Starting from ASP.NET Core 3.0, ASP.NET Core MVC removes the Async suffix from controller action names. Both routing and link generation are impacted by this new default. If you want to move back to the old behaviour, you can change it in the configuration:
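The opt-out lives on MvcOptions; restoring the pre-3.0 behaviour looks like this in Startup.ConfigureServices:

```csharp
services.AddControllers(options =>
{
    // Keep the "Async" suffix in action names for routing
    // and link generation, as in ASP.NET Core 2.x.
    options.SuppressAsyncSuffixInActionNames = false;
});
```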

XUnit - Assert.Inconclusive() alternative

I got a question from a colleague who was used to MSTest and is now switching to XUnit. For tests that are not implemented yet, he was using the Assert.Inconclusive method in MSTest, but he couldn't find a similar Assert method in XUnit. I typically use the 'Skip' property for this in XUnit:
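A minimal example (the test name is made up): the Skip property on the Fact attribute excludes the test from the run and reports it as skipped with the given reason.

```csharp
using Xunit;

public class DiscountTests
{
    [Fact(Skip = "Not implemented yet")]
    public void CalculateDiscount_AppliesPercentage()
    {
        // Intentionally empty — the Skip reason shows up
        // in the test runner output instead of a failure.
    }
}
```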

TypeScript - Resolve JSON as a Javascript Module

I discovered something neat that is possible in TypeScript. Did you know that you can import JSON as a JavaScript module? I didn't... I had a configuration file like this: In my tsconfig.json I had to set resolveJsonModule to true: Now I could import the config file as a module. The nice thing is that I even get type checking and autocompletion! Remark: The TypeScript team chose not to enable this option by default to avoid pulling in large JSON files that would consume a lot of memory.
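The relevant tsconfig.json fragment would look something like this (esModuleInterop is an assumption on my part; it is commonly paired with JSON imports):

```json
{
  "compilerOptions": {
    "resolveJsonModule": true,
    "esModuleInterop": true
  }
}
```

After which an import such as `import config from './config.json';` compiles, and `config` carries the inferred shape of the JSON file.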

Azure DevOps - Delete an existing collection

After a failed Azure DevOps migration, I ended up with an unusable collection on our Azure DevOps server. I had a look at the Azure DevOps administration console but couldn't find a way to delete the collection. Luckily there is still the command line; I was able to do it through the tfsconfig tool:

D:\Program Files\Azure DevOps Server 2020\Tools>tfsconfig collection /delete /collectionName:FailedCollection
Logging sent to file C:\ProgramData\Microsoft\Azure DevOps\Server Configuration\Logs\CFG_TPC_AT_0129_130947.log

TfsConfig - Azure DevOps Server Configuration Tool
Copyright (c) Microsoft Corporation. All rights reserved.

Command: collection

Deleting a Team Project Collection is an irreversible operation. A deleted collection can not be reattached to the same or another Azure DevOps Server. Are you sure you want to delete 'FailedCollection'? (Yes/No) Y

Azure DevOps Server–Move collections–Offline detach

One of the basic building blocks of Azure DevOps Server is a “Project collection”. The documentation describes it like this: The primary organizational unit for all data in Azure DevOps Server. Collections determine the resources available to the projects added to them. From a technical standpoint, a project collection is a separate database. It is possible to move a project collection between different database instances, but be aware that just backing up a collection on one database server and restoring it on another database server is not sufficient. The well-documented way is through a detach/attach operation, as I explained last week. But another ‘hidden’ option exists through an “Offline detach”. Backup the collection and configuration database Take a backup of the collection database that you want to move using a tool of your choice and restore it on the database server used by your target Azure DevOps server. Also take a backup of the configuration database and