

Showing posts from January, 2021

Azure DevOps Server–Move collections–Detach/Attach

One of the basic building blocks of Azure DevOps Server is the “project collection”. The documentation describes it like this: “The primary organizational unit for all data in Azure DevOps Server. Collections determine the resources available to the projects added to them.” From a technical standpoint, a project collection is a separate database. It is possible to move a project collection between database instances, but be aware that just backing up a collection on one database server and restoring it on another is not sufficient. Instead, you should do this through a detach/attach operation. Detach a collection: before you move a collection, you need to detach it. You can do this from the Azure DevOps Administration Console. When detaching, all jobs and services are stopped, and then the collection database is stopped. In addition, the detach process copies the collection-specific data from the configuration database and saves it as part of…
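The same detach/attach operation can also be scripted on the application tier with the TfsConfig tool. The sketch below is a hypothetical illustration; the collection name, SQL instance, and database name are placeholders, and you should verify the exact switches against the documentation for your Azure DevOps Server version.

```shell
# Detach the collection on the old application tier (placeholder names).
TfsConfig collection /detach /collectionName:"DefaultCollection"

# ...back up the collection database, restore it on the new SQL instance...

# Attach it again, pointing at the restored database.
TfsConfig collection /attach /collectionDb:"NewSqlInstance;Tfs_DefaultCollection"
```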

C# 9–Use Record types in .NET Standard 2.0

I created a small record type in C#: However, the compiler didn’t like it: Why? Because I was adding this code to a .NET Standard 2.0 library, which doesn’t support C# 9 features. As a reminder, a small overview:

Target framework      Default C# language version
.NET 5                C# 9.0
.NET Core 3.x         C# 8.0
.NET Core 2.x         C# 7.3
.NET Standard 2.1     C# 8.0
.NET Standard 2.0     C# 7.3

I didn’t want to upgrade to .NET 5, so what were my options? It turns out you can work around this with a small compiler hack. Let’s try it: first you need to override the language version in the csproj file. Then you should add the following code to get rid of the compiler error:
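A minimal sketch of the workaround: force the language version in the csproj (step 1, shown in the comment) and then define the `IsExternalInit` type the compiler expects for init-only setters, since `netstandard2.0` doesn’t ship it (step 2). The `Person` record is just an illustrative example.

```csharp
// Step 1 (csproj): override the language version, e.g.
//   <LangVersion>9.0</LangVersion>

// Step 2: supply the marker type the C# 9 compiler looks up for
// init-only properties; .NET Standard 2.0 does not include it.
namespace System.Runtime.CompilerServices
{
    internal static class IsExternalInit { }
}

// With the shim in place, record declarations compile as usual.
public record Person(string FirstName, string LastName);
```

Note that this shim only satisfies the compiler; C# 9 features that need actual runtime support (like covariant return types) still won’t work on older targets.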

Combining Autofac and Microsoft DI in unit tests

I had to write a test where I needed to inject a configuration dependency through IOptions<>. So far I had been using Autofac in my tests, and I didn’t want to go through all the trouble of getting IOptions<> working inside Autofac (especially when Microsoft DI offers the convenient AddOptions<> and Configure<> methods). Luckily it is not that hard to bring the two DI containers together. Here is the code I used inside my tests:
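A sketch of one way to combine the two containers, assuming the Autofac.Extensions.DependencyInjection package: build a ServiceCollection for the Microsoft-specific pieces, then copy its registrations into the Autofac container with Populate(). The `MySettings` class is a hypothetical options type for illustration.

```csharp
using Autofac;
using Autofac.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Options;

public class MySettings          // hypothetical options class
{
    public string ConnectionString { get; set; }
}

public static class TestContainerFactory
{
    public static IContainer Create()
    {
        // Use Microsoft DI for the options plumbing...
        var services = new ServiceCollection();
        services.AddOptions<MySettings>()
                .Configure(s => s.ConnectionString = "Server=test");

        // ...then pull those registrations into Autofac.
        var builder = new ContainerBuilder();
        builder.Populate(services);
        // Register the rest of the test dependencies with Autofac as usual.

        return builder.Build();
    }
}

// Resolving IOptions<MySettings> now works from the Autofac container:
// var options = container.Resolve<IOptions<MySettings>>();
```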

C# 9–Module Initializer

Yesterday I blogged about an issue I had with Telerik. I needed to initialize some code before I could call the PDF conversion functionality. The question is: where should I put this logic? A perfect case to try out a new C# 9 feature: module initializers. Module initializers allow you to do eager, one-time initialization when a module is loaded, with minimal overhead and without the user needing to explicitly call anything. Creating a module initializer is easy: just add the ModuleInitializerAttribute on top of a method. Some requirements are imposed on the method targeted with this attribute: the method must be static, parameterless, and return void; it must not be generic or be contained in a generic type; and it must be accessible from the containing module.
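A minimal example that meets all the requirements listed above (the initialization body is a placeholder for whatever one-time setup you need, such as the Telerik converter registration):

```csharp
using System.Runtime.CompilerServices;

internal static class ModuleSetup
{
    internal static bool Initialized;

    // Runs once, automatically, when the module is loaded -
    // before any other code in the assembly executes.
    [ModuleInitializer]
    internal static void Initialize()
    {
        // One-time setup goes here (e.g. registering a PDF image converter).
        Initialized = true;
    }
}
```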

Telerik Document Processing–InvalidOperationException

When trying to convert a Word document containing a high-res image to PDF through the Telerik Document Processing libraries, I got the error message below:

System.InvalidOperationException
  HResult=0x80131509
  Message=FixedExtensibilityManager.JpegImageConverter cannot be null. The .NET Standard does not define APIs for converting images or scaling their quality. In order to export images different than Jpeg and Jpeg2000 or ImageQuality different than High you will need to reference the Telerik.Documents.ImageUtils assembly/NuGet in your project and to set its basic implementation to the FixedExtensibilityManager.JpegImageConverter property or to create a custom one inheriting the JpegImageConverterBase class. For more information go to:
  Source=Telerik.Documents.Fixed
  StackTrace:
   at Telerik.Windows.Documents.Fixed…
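Following the guidance in the error message itself, the fix is to reference the Telerik.Documents.ImageUtils package and plug its converter in once at startup. The type and namespace names below are taken from the Telerik error text and documentation as I understand them; verify them against the Telerik documentation for your version.

```csharp
using Telerik.Documents.ImageUtils;
using Telerik.Windows.Documents.Extensibility;

public static class PdfExportSetup
{
    // Call once before any Word-to-PDF conversion.
    public static void Configure()
    {
        // Provide the JPEG converter implementation that .NET Standard
        // itself does not supply.
        FixedExtensibilityManager.JpegImageConverter = new JpegImageConverter();
    }
}
```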

HotChocolate– GraphQL Schema Stitching Introduction

If you are new to GraphQL schema stitching, I can recommend the On .NET show with Michael Staib, one of the creators of Hot Chocolate. With schema stitching, developers can create a unified GraphQL schema from multiple underlying GraphQL APIs. This lets clients fetch data that would otherwise require queries against multiple APIs in a single request against one schema. In this episode, Jeremy chats with the author of Hot Chocolate, Michael Staib, about how .NET developers can implement GraphQL schema stitching with Hot Chocolate.

ASP.NET Core–Performance tip

Question: what is wrong with the following ASP.NET Core controller? No idea? Take a look at the return type. Returning IEnumerable<T> from an action results in synchronous collection iteration by the serializer. The result is blocked calls and a potential for thread pool starvation. One possible way to avoid this is by explicitly invoking ToListAsync before returning the result. Starting from ASP.NET Core 3.0 you can also use IAsyncEnumerable<T>. This no longer results in synchronous iteration and is as efficient as returning IEnumerable<T>. Let’s rewrite the example above to use IAsyncEnumerable<T>: REMARK: if we take a look at the original example again, it will not actually be an issue in ASP.NET Core 3.0 or higher. Why? Because we allow the framework to choose how to handle the IQueryable<T> we get back from the Where() clause. When we are using ASP.NET Core 3.0 or higher, the results will be buffered and correctly…
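A sketch of the rewritten action, assuming an EF Core context; the `AppDbContext` and `Product` types are hypothetical stand-ins for the original example’s code:

```csharp
using System.Collections.Generic;
using System.Linq;
using Microsoft.AspNetCore.Mvc;
using Microsoft.EntityFrameworkCore;

[ApiController]
[Route("api/[controller]")]
public class ProductsController : ControllerBase
{
    private readonly AppDbContext _context;   // hypothetical EF Core context

    public ProductsController(AppDbContext context) => _context = context;

    // Returning IAsyncEnumerable<T> lets the serializer iterate the
    // query asynchronously instead of blocking a thread-pool thread.
    [HttpGet]
    public IAsyncEnumerable<Product> Get()
        => _context.Products
                   .Where(p => p.IsActive)
                   .AsAsyncEnumerable();
}
```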

ASP.NET Core–Open API–Possible parameter values

I’m currently building an ASP.NET Core API used to convert between document formats. You can upload a document, specify the “from” and “to” formats, and get a converted document back. The list of supported conversions is determined dynamically based on a list of format providers loaded through a plugin architecture. To help developers get to know the API, I created Open API documentation using Swagger. Unfortunately the generated Open API documentation wasn’t that useful out-of-the-box, as you get no hint about the possible “from” and “to” values. My first idea was to change the string value to an enum, but this would require a recompile every time a new document format is introduced (and make my plugin architecture obsolete). My second idea was to just update the documentation, but in fact this was even worse than my first idea. Time to find a better solution: I created a custom ParameterFilter that dynamically generates a list of available values based on the loaded plugins:
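A sketch of such a filter using Swashbuckle’s IParameterFilter; the `IFormatProviderRegistry` interface and the parameter names "from"/"to" are hypothetical stand-ins for the plugin registry and endpoint of the original post:

```csharp
using System.Linq;
using Microsoft.OpenApi.Any;
using Microsoft.OpenApi.Models;
using Swashbuckle.AspNetCore.SwaggerGen;

// Hypothetical registry that knows the formats of the loaded plugins.
public interface IFormatProviderRegistry
{
    string[] SupportedFormats { get; }
}

public class FormatParameterFilter : IParameterFilter
{
    private readonly IFormatProviderRegistry _registry;

    public FormatParameterFilter(IFormatProviderRegistry registry)
        => _registry = registry;

    public void Apply(OpenApiParameter parameter, ParameterFilterContext context)
    {
        // Only decorate the format parameters of the convert endpoint.
        if (parameter.Name is "from" or "to")
        {
            // Advertise the currently loaded formats as allowed values.
            parameter.Schema.Enum = _registry.SupportedFormats
                .Select(f => (IOpenApiAny)new OpenApiString(f))
                .ToList();
        }
    }
}

// Registration:
// services.AddSwaggerGen(o => o.ParameterFilter<FormatParameterFilter>());
```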

HotChocolate Schema stitching - Remove root types

When using (Hot Chocolate) schema stitching, by default all root query types are merged and available to use in your stitched schema. This is great as default behavior, but not if you want to build up the stitched schema in a different way. To start from a clean slate, you can call IgnoreRootTypes():
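A sketch against the Hot Chocolate 10 stitching builder; "customers" and "orders" are hypothetical downstream schema names, and the exact builder methods should be verified against your Hot Chocolate version:

```csharp
using HotChocolate.Stitching;
using Microsoft.Extensions.DependencyInjection;

public static class SchemaStitchingSetup
{
    public static void Configure(IServiceCollection services)
    {
        services.AddStitchedSchema(builder => builder
            .AddSchemaFromHttp("customers")
            .AddSchemaFromHttp("orders")
            // Drop the auto-merged root types so the stitched schema
            // can be built up explicitly instead.
            .IgnoreRootTypes()
            // Define your own root fields, e.g. via an extension file.
            .AddExtensionsFromFile("stitching.graphql"));
    }
}
```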

ASP.NET Core - Set ContentType

Today I was looking at a way to set the correct MIME type for a response based on the file extension. What is a MIME type? A media type (also known as a Multipurpose Internet Mail Extensions or MIME type) is a standard that indicates the nature and format of a document, file, or assortment of bytes. It is defined and standardized in IETF RFC 6838. How do you set the MIME type in ASP.NET Core? You can set the MIME type of the file you return through the contentType parameter of the File method on the ControllerBase class. How do you get the MIME type from a file extension? My first idea was to write a switch statement with the list of file extensions and the corresponding MIME types. But this turned out not to be necessary, as the ASP.NET Core team already provides this functionality out-of-the-box through the FileExtensionContentTypeProvider:
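A sketch combining both steps (the controller, route, and file handling are simplified placeholders):

```csharp
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.StaticFiles;

public class DownloadController : ControllerBase
{
    private static readonly FileExtensionContentTypeProvider ContentTypeProvider = new();

    [HttpGet("download")]
    public IActionResult Download(string fileName)
    {
        // Map the extension to a MIME type; fall back to a generic
        // binary type when the extension is unknown.
        if (!ContentTypeProvider.TryGetContentType(fileName, out var contentType))
        {
            contentType = "application/octet-stream";
        }

        var bytes = System.IO.File.ReadAllBytes(fileName); // simplified for the sketch
        return File(bytes, contentType, fileName);
    }
}
```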

WSFederation is back!

Great news for everyone who still has some WCF services running. Until recently only BasicHttpBinding and NetTcpBinding were supported. This limited the usefulness, as most WCF services I encountered were using WSFederationHttpBinding (or the more recent WS2007FederationHttpBinding). These bindings were typically used in scenarios where users authenticate with a security token service (like Active Directory Federation Services). Last month, the System.ServiceModel.Federation package was released, adding client(!) support for WS-Federation. Remark: you cannot use these bindings yet through dotnet-svcutil or Visual Studio’s WCF Web Service Reference Provider tool, but this will be added in the future.
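A rough sketch of the new client-side binding from the System.ServiceModel.Federation package; the endpoint addresses are placeholders, and the exact WSTrustTokenParameters setup depends on your STS (check the package documentation):

```csharp
using System.ServiceModel;
using System.ServiceModel.Federation;

// Binding and address of the security token service (placeholders).
var issuerBinding = new WSHttpBinding(SecurityMode.Transport);
var issuerAddress = new EndpointAddress(
    "https://sts.example.com/adfs/services/trust/13/usernamemixed");

// Describe how to request a token from the STS (WS-Trust 1.3 here).
var tokenParameters = WSTrustTokenParameters
    .CreateWS2007FederationTokenParameters(issuerBinding, issuerAddress);

// The federated binding for the actual service call.
var binding = new WSFederationHttpBinding(tokenParameters);

// Use it with a generated client or a ChannelFactory:
// var factory = new ChannelFactory<IMyService>(
//     binding, new EndpointAddress("https://service.example.com/MyService.svc"));
```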

NUnit - Combining multiple asserts in one test

The default rule when writing tests is ‘one test, one assert’. There are multiple reasons why this is a good idea: it becomes very clear what your test is doing and what the success or failure condition is, and if there are multiple asserts in your test, the test will fail on the first assert that fails, so the other assertions are never evaluated. However, this sometimes leads to duplication, where you are writing the same test multiple times to assert different aspects. In NUnit you can avoid the second problem by using ‘Assert.Multiple()’. NUnit will store any failures encountered in the Multiple block and report all of them together: if multiple asserts fail, all of them will be reported.
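A small example; the `Order` class and its members are hypothetical stand-ins for whatever you are testing:

```csharp
using NUnit.Framework;

[TestFixture]
public class OrderTests
{
    [Test]
    public void NewOrder_HasExpectedDefaults()
    {
        var order = new Order();   // hypothetical class under test

        // All assertions inside the block are evaluated; failures are
        // collected and reported together instead of stopping at the first.
        Assert.Multiple(() =>
        {
            Assert.That(order.Lines, Is.Empty);
            Assert.That(order.Total, Is.EqualTo(0));
            Assert.That(order.Status, Is.EqualTo(OrderStatus.New));
        });
    }
}
```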

Integrating the RabbitMQ client in a .NET Core application

I couldn’t find a good example of the best way to integrate the RabbitMQ C# client in a .NET Core application, so it’s time to write a post about it. I would recommend using hosted services, either directly through IHostedService or through BackgroundService. Let’s see how to do it using BackgroundService. We create a new class that inherits from BackgroundService. We use the StartAsync method to create the ConnectionFactory, Connection, and a Channel to listen on. Remark: notice the DispatchConsumersAsync = true in the ConnectionFactory configuration. This is important to be able to use an async consumer; if you don’t add this configuration, no messages will be picked up by the AsyncEventingBasicConsumer. We also implement the StopAsync method to clean up when our background service is shut down. Now the only thing left is to create a consumer to start receiving messages. We’ll use the ExecuteAsync method for that…
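The steps above can be sketched roughly as follows; the host name, queue name, and message handling are placeholders:

```csharp
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Hosting;
using RabbitMQ.Client;
using RabbitMQ.Client.Events;

public class QueueListenerService : BackgroundService
{
    private IConnection _connection;
    private IModel _channel;

    public override Task StartAsync(CancellationToken cancellationToken)
    {
        var factory = new ConnectionFactory
        {
            HostName = "localhost",          // placeholder
            DispatchConsumersAsync = true    // required for the async consumer
        };
        _connection = factory.CreateConnection();
        _channel = _connection.CreateModel();
        _channel.QueueDeclare("demo-queue", durable: true,
                              exclusive: false, autoDelete: false);
        return base.StartAsync(cancellationToken);
    }

    protected override Task ExecuteAsync(CancellationToken stoppingToken)
    {
        var consumer = new AsyncEventingBasicConsumer(_channel);
        consumer.Received += async (sender, args) =>
        {
            var message = Encoding.UTF8.GetString(args.Body.ToArray());
            // ...handle the message...
            _channel.BasicAck(args.DeliveryTag, multiple: false);
            await Task.Yield();
        };
        _channel.BasicConsume("demo-queue", autoAck: false, consumer);
        return Task.CompletedTask;
    }

    public override Task StopAsync(CancellationToken cancellationToken)
    {
        _channel?.Close();
        _connection?.Close();
        return base.StopAsync(cancellationToken);
    }
}
```

Register it like any other hosted service: `services.AddHostedService<QueueListenerService>();`.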

Find the multiplier(s) in your team

Remark: this blog post is dedicated to all the team multipliers I had the pleasure to work with over the years. A naïve way to measure the productivity of a developer is to look at individual work output (let’s just ignore how we measure work output in this post; everyone agrees it is the number of lines written, right?). The reality is that most software is developed by a team, where the work output of each member is a function of the work output of all their teammates. Measuring individual productivity can be hard to impossible, and taking this approach would certainly leave out the most important people on your team: the team multipliers. These people may not accomplish a lot on their own, but their effect on the team’s productivity is exponential. These are the people who can make the difference between success and failure. They typically don’t produce a lot of features on their own, but they make the jobs of their teammates a lot easier, are always there to help, and are willing to do the dirty…

GraphQL Hot Chocolate–Application Insights monitoring

If you use the default Application Insights configuration to monitor your GraphQL endpoint, there is not that much information you’ll get. For example, when a GraphQL request fails, it still results in a 200 response; Application Insights isn’t aware that you are using GraphQL and doesn’t flag this request as an exception. Let’s see how we can improve our Application Insights logging by integrating it with Hot Chocolate. Disclaimer: the code below was written for Hot Chocolate 10. In Hot Chocolate 11 the diagnostics system was rewritten; I’ll write an update post when I have the chance to upgrade one of my applications to Hot Chocolate 11. The first thing we need to do is create a class that implements the IDiagnosticObserver marker interface and inject the Application Insights TelemetryClient. For every event that we want to track, we need to add a method and specify the corresponding Hot Chocolate event through the DiagnosticName attribute. We create a dictionary…
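A minimal sketch of such an observer against the Hot Chocolate 10 diagnostics API; the event name and handler signature shown here are my assumptions and should be verified against the Hot Chocolate 10 documentation for your exact version:

```csharp
using System;
using HotChocolate.Execution;
using HotChocolate.Execution.Instrumentation;
using Microsoft.ApplicationInsights;
using Microsoft.Extensions.DiagnosticAdapter;

public class ApplicationInsightsDiagnosticObserver : IDiagnosticObserver
{
    private readonly TelemetryClient _telemetryClient;

    public ApplicationInsightsDiagnosticObserver(TelemetryClient telemetryClient)
        => _telemetryClient = telemetryClient;

    // Track GraphQL errors as exceptions so failed queries no longer
    // hide behind a 200 response (assumed event name).
    [DiagnosticName("HotChocolate.Execution.Query.Error")]
    public void OnQueryError(IQueryContext context, Exception exception)
    {
        _telemetryClient.TrackException(exception);
    }
}
```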

Visual Studio Code Coding Packs

Microsoft introduced the concept of Coding Packs for Visual Studio Code. What is a Coding Pack? Coding Packs are specialized installs of Visual Studio Code that are pre-configured for specific coding environments. They are designed to help students and educators ramp up on VS Code more quickly by automating editor and environment configuration. A Coding Pack is a single download that performs the following when installed: installs VS Code, installs a specific runtime (Python, Java, etc.), installs essential extensions, and ensures installs happen under the user scope so as not to require administrator permissions. These are especially useful for getting the necessary tools installed on student PCs when giving a training. Right now coding packs exist for Python and Java, but I hope others will be added in the future:

Configure Visual Studio 2019 to use SourceLink

Source Link allows developers to debug the source code of the libraries they are using. Tools like Visual Studio can step into that source code, which helps you find out what is going on behind the scenes. Let’s see how we can enable Source Link debugging in Visual Studio. Go to Tools > Options > Debugging > Symbols and check that the ‘Symbol Server’ option is checked. It is recommended to specify a local directory that can be used as a symbol cache, to avoid the need to download symbols every time. Disable ‘Just My Code’ in Tools > Options > Debugging > General to make the debugger load symbols that are not part of your current solution. Also check that ‘Enable Source Link support’ is set (which should be the case by default). When you now try to debug code, you can step into it through F11; Visual Studio will ask you to download the corresponding source code. Troubleshooting: in case you get a ‘File Dialog’…

Using System.Diagnostic.DiagnosticSource to intercept database requests– Improvements

Yesterday I blogged about the usage of DiagnosticSource to capture database-specific events. The code we ended up with looked like this: What is rather unfortunate is that you are working with generic events and that you need to use magic strings and reflection to capture the object data. A solution exists through the Microsoft.Extensions.DiagnosticAdapter package. This NuGet package uses code generation to avoid the cost of reflection. Add a reference to the Microsoft.Extensions.DiagnosticAdapter package: dotnet add package Microsoft.Extensions.DiagnosticAdapter. Then update the implementation to use the SubscribeWithAdapter extension method. Now we no longer need to implement the IObserver&lt;KeyValuePair&lt;string, object&gt;&gt; interface. Instead, for each event that we want to handle, we declare a separate method and mark it with the DiagnosticNameAttribute. The parameters of these methods are the parameters of the event…
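A sketch of the adapter-based observer for the SqlClient events; the event names below are the ones System.Data.SqlClient publishes, and the adapter binds the handler parameters to the event payload properties by name:

```csharp
using System;
using System.Data;
using Microsoft.Extensions.DiagnosticAdapter;

public class SqlClientAdapterObserver
{
    // The parameter name "command" matches the Command property of the
    // event payload, so no reflection code is needed in the handler.
    [DiagnosticName("System.Data.SqlClient.WriteCommandBefore")]
    public void OnCommandBefore(IDbCommand command)
    {
        Console.WriteLine($"Executing: {command.CommandText}");
    }

    [DiagnosticName("System.Data.SqlClient.WriteCommandAfter")]
    public void OnCommandAfter(IDbCommand command)
    {
        Console.WriteLine("Command finished");
    }
}

// Subscription (on the SqlClient DiagnosticListener):
// listener.SubscribeWithAdapter(new SqlClientAdapterObserver());
```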

Using System.Diagnostic.DiagnosticSource to intercept database requests

.NET has built-in integration with DiagnosticSource. Through this class, libraries can send events and applications can subscribe to them. Each such event contains additional information that can help you diagnose what’s going on inside your application. The great thing is that DiagnosticSource is already used by libraries like ASP.NET Core, Entity Framework Core, HttpClient, and SqlClient, which makes it possible for developers to intercept incoming/outgoing HTTP requests, database queries, and so on. Let’s create a small example that allows us to intercept database requests coming from SqlClient. Add a reference to the System.Diagnostics.DiagnosticSource NuGet package: dotnet add package System.Diagnostics.DiagnosticSource. Create an Observer class that implements the IObserver&lt;DiagnosticListener&gt; interface (we’ll complete the implementation later). Subscribe an instance of the Observer class through the DiagnosticListener.AllListeners object:
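The steps above can be sketched as follows: the outer observer watches for the SqlClient listener to appear, and the inner observer receives the actual events. The console logging is a placeholder for real diagnostics handling.

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;

// Watches all DiagnosticListeners and hooks the SqlClient one.
public class SqlClientObserver : IObserver<DiagnosticListener>
{
    public void OnNext(DiagnosticListener listener)
    {
        if (listener.Name == "SqlClientDiagnosticListener")
        {
            listener.Subscribe(new SqlClientEventObserver());
        }
    }

    public void OnCompleted() { }
    public void OnError(Exception error) { }
}

// Receives the individual SqlClient events as name/payload pairs.
public class SqlClientEventObserver : IObserver<KeyValuePair<string, object>>
{
    public void OnNext(KeyValuePair<string, object> evt)
    {
        Console.WriteLine($"Received event: {evt.Key}");
    }

    public void OnCompleted() { }
    public void OnError(Exception error) { }
}

// Wire-up, typically once at startup:
// DiagnosticListener.AllListeners.Subscribe(new SqlClientObserver());
```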

Improve the debugging experience with deterministic builds

When creating a NuGet package from a .NET Core project, I noticed the following warning in the NuGet Package Explorer: Source Link (which I’ll talk about in another post) is valid, but the build is marked as ‘Non deterministic’. What does this mean? By default builds are non-deterministic, meaning there is no guarantee that building the same code twice (on the same or a different machine) will produce exactly the same binary output. Deterministic builds are important as they enable verification that the resulting binary was built from the specified source, which provides traceability. How to fix it? To enable deterministic builds, the ContinuousIntegrationBuild property should be set to true. Important: this property should not be enabled during local development, as the debugger won’t be able to find the local source files. Therefore, you should use your CI system’s variables to set it conditionally. For Azure Pipelines, the TF_BUILD variable can be used:
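A sketch of the conditional property for the csproj (or a Directory.Build.props), assuming Azure Pipelines, which sets TF_BUILD on its build agents:

```xml
<!-- Only enable deterministic CI builds when running under Azure
     Pipelines, so local debugging keeps working. -->
<PropertyGroup Condition="'$(TF_BUILD)' == 'true'">
  <ContinuousIntegrationBuild>true</ContinuousIntegrationBuild>
</PropertyGroup>
```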

Customize your build with Directory.Build.props and Directory.Build.targets

With the release of MSBuild version 15, you no longer need to add custom properties to every project in your solution. Instead you can create a single file called Directory.Build.props in the root folder of your source. When MSBuild runs, it will search for this file and load it when found. I use this on my projects to set common properties. Directory.Build.props is imported very early in the build pipeline and can be overridden elsewhere in the project file or in imported files. If you want to override properties set in a specific project, use Directory.Build.targets instead. It is discovered and used in the same way, but it is loaded much later in the build pipeline, so it can override properties and targets defined in specific projects. More information:
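An example Directory.Build.props placed in the repository root; the properties shown are common choices for illustration, not required ones:

```xml
<!-- Applies to every project underneath this folder. -->
<Project>
  <PropertyGroup>
    <LangVersion>latest</LangVersion>
    <TreatWarningsAsErrors>true</TreatWarningsAsErrors>
    <Authors>My Team</Authors>
  </PropertyGroup>
</Project>
```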