
Showing posts from 2020

ASP.NET Core–Use dynamic proxy with standard dependency injection - Async

Yesterday I blogged about using Castle.DynamicProxy to generate a proxy class and use AOP techniques for caching. The code I showed worked for synchronous method calls but fails when you want to proxy async method calls. Let’s see how we can get this working for async… Here is the async version of our repository. We need an extra NuGet package: dotnet add package Castle.Core.AsyncInterceptor. Now we have to change our interceptor to call a second async interceptor. Our extension method that triggers the proxy creation remains the same, but our registration code should be extended to register the async interceptor as well.
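The post’s embedded samples aren’t included in this excerpt; below is a minimal sketch of what an async-aware caching interceptor can look like with Castle.Core.AsyncInterceptor. The cache-key scheme and the five-minute lifetime are illustrative assumptions, not the post’s own code.

    using System;
    using System.Threading.Tasks;
    using Castle.DynamicProxy;
    using Microsoft.Extensions.Caching.Memory;

    public class CacheAsyncInterceptor : IAsyncInterceptor
    {
        private readonly IMemoryCache _cache;

        public CacheAsyncInterceptor(IMemoryCache cache) => _cache = cache;

        // Pass-through for synchronous and plain Task-returning methods.
        public void InterceptSynchronous(IInvocation invocation) => invocation.Proceed();
        public void InterceptAsynchronous(IInvocation invocation) => invocation.Proceed();

        public void InterceptAsynchronous<TResult>(IInvocation invocation)
        {
            invocation.ReturnValue = InterceptWithCacheAsync<TResult>(invocation);
        }

        private async Task<TResult> InterceptWithCacheAsync<TResult>(IInvocation invocation)
        {
            // Illustrative cache key: method name + arguments.
            var key = $"{invocation.Method.Name}_{string.Join("_", invocation.Arguments)}";
            if (_cache.TryGetValue(key, out TResult cached))
                return cached;

            invocation.Proceed(); // runs synchronously up to the first await
            var result = await (Task<TResult>)invocation.ReturnValue;
            return _cache.Set(key, result, TimeSpan.FromMinutes(5));
        }
    }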

ASP.NET Core–Use dynamic proxy with standard dependency injection

You don’t need a third-party IoC container to use AOP (Aspect Oriented Programming) and the proxy pattern in ASP.NET Core. We’ll combine the power of Castle.DynamicProxy and the standard DI container to make this possible. Castle’s dynamic proxies allow you to create proxies of abstract classes, interfaces and classes (only for virtual methods/properties). Let’s create an example that caches the output of a repository call. Here is the example repository that we want to proxy. Now we first need to create an interceptor that intercepts the method calls and caches the response. Let’s move on to the DI registration: we first need to create an extension method that triggers the proxy creation. Almost there; as a last step we need to register everything.
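Since the embedded samples aren’t shown in this excerpt, here is a sketch of the wiring under assumed names (CacheInterceptor is a class implementing Castle’s IInterceptor; IProductRepository/ProductRepository are illustrative):

    using Castle.DynamicProxy;
    using Microsoft.Extensions.DependencyInjection;

    public static class ProxyServiceCollectionExtensions
    {
        private static readonly ProxyGenerator Generator = new ProxyGenerator();

        // Registers TImplementation and exposes TInterface through a proxy
        // that routes calls through the caching interceptor.
        public static IServiceCollection AddProxiedScoped<TInterface, TImplementation>(
            this IServiceCollection services)
            where TInterface : class
            where TImplementation : class, TInterface
        {
            services.AddScoped<TImplementation>();
            services.AddScoped(provider =>
            {
                var target = provider.GetRequiredService<TImplementation>();
                var interceptor = provider.GetRequiredService<CacheInterceptor>();
                return Generator.CreateInterfaceProxyWithTarget<TInterface>(target, interceptor);
            });
            return services;
        }
    }

    // Registration:
    // services.AddMemoryCache();
    // services.AddScoped<CacheInterceptor>();
    // services.AddProxiedScoped<IProductRepository, ProductRepository>();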

Azure DevOps–Set git tag on build

I was using a custom build task to add a tag to a git commit after a successful release. Turns out that this is not necessary and that you can tag a build directly in Azure DevOps pipelines. This can be done in Get Sources, where you have a Tag sources option that lets you decide which tag should be set. Simple and useful!

.NET Core - Clear IMemoryCache

.NET Core gives you two options for caching data in memory: either you use System.Runtime.Caching / MemoryCache (NuGet package) or Microsoft.Extensions.Caching.Memory / IMemoryCache. The latter is recommended because it is better integrated into ASP.NET Core; for example, IMemoryCache works natively with ASP.NET Core dependency injection. Use System.Runtime.Caching / MemoryCache only as a compatibility bridge when porting code from ASP.NET 4.x to ASP.NET Core. The memory cache can be registered in ASP.NET Core through the AddMemoryCache extension method. I was looking for a way to clear the cache. However, I noticed there wasn’t an appropriate method available when I took a look at the IMemoryCache interface. A search on the Internet brought me to a suggested solution that iterates over all the cache keys. This is not a good idea. Why? Take a look at the following remark in the documentation…
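The approach the post lands on isn’t visible in this excerpt; one commonly used pattern (shown here as a sketch, not the post’s code) is to tie all entries to a CancellationTokenSource and cancel it to evict everything at once:

    using System;
    using System.Threading;
    using Microsoft.Extensions.Caching.Memory;
    using Microsoft.Extensions.Primitives;

    public class ClearableCache
    {
        private readonly IMemoryCache _cache;
        private CancellationTokenSource _resetToken = new CancellationTokenSource();

        public ClearableCache(IMemoryCache cache) => _cache = cache;

        public void Set<T>(string key, T value)
        {
            var options = new MemoryCacheEntryOptions()
                .AddExpirationToken(new CancellationChangeToken(_resetToken.Token));
            _cache.Set(key, value, options);
        }

        // Cancelling the token expires every entry created through Set.
        public void Clear()
        {
            _resetToken.Cancel();
            _resetToken.Dispose();
            _resetToken = new CancellationTokenSource();
        }
    }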

Azure Durable Entities–The actor model in disguise

The Azure Functions ecosystem introduced a new member: durable entities, also known as entity functions. Durable entities behave like tiny services that communicate via messages. Each entity has a unique identity and an internal state (if it exists). In that way they are quite similar to actors in the actor model (although there are some important differences in the way they are implemented compared to Orleans or Akka.net). There are two ways to create entities: The class-based syntax represents entities and operations as classes and methods. This syntax produces easily readable code and allows operations to be invoked in a type-checked manner through interfaces. The function-based syntax is a lower-level interface that represents entities as functions. It provides precise control over how the entity operations are dispatched and how the entity state is managed. Let’s have a look at the class-based syntax as this is more familiar for people coming from another…
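As a taste of the class-based syntax, here is (roughly) the canonical counter entity from the Microsoft docs for Durable Functions 2.x:

    using System.Threading.Tasks;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.WebJobs.Extensions.DurableTask;
    using Newtonsoft.Json;

    [JsonObject(MemberSerialization.OptIn)]
    public class Counter
    {
        [JsonProperty("value")]
        public int Value { get; set; }

        // Operations are just methods; callers invoke them by name or interface.
        public void Add(int amount) => Value += amount;
        public void Reset() => Value = 0;
        public int Get() => Value;

        [FunctionName(nameof(Counter))]
        public static Task Run([EntityTrigger] IDurableEntityContext ctx)
            => ctx.DispatchAsync<Counter>();
    }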

ASP.NET Core WSFederation - Return 401 for API calls

By default, when you configure the WSFederation middleware in ASP.NET Core, you will be redirected to the Identity Provider when an unauthenticated request arrives on the server. This is what you would expect when the middleware is invoked from an ASP.NET Core MVC or Razor Pages webpage, but probably not what you want when it is an API. In that case a 401 would be a better response. To achieve this you should handle the OnRedirectToIdentityProvider event and change the response to 401:
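A sketch of that event handler; the /api path check is an illustrative way to distinguish API calls, and the metadata settings are placeholders:

    // using Microsoft.AspNetCore.Http;
    // using System.Threading.Tasks;
    services.AddAuthentication(/* schemes */)
        .AddWsFederation(options =>
        {
            // MetadataAddress / Wtrealm configuration omitted here.
            options.Events.OnRedirectToIdentityProvider = context =>
            {
                if (context.Request.Path.StartsWithSegments("/api"))
                {
                    context.Response.StatusCode = StatusCodes.Status401Unauthorized;
                    context.HandleResponse(); // suppress the redirect to the IdP
                }
                return Task.CompletedTask;
            };
        });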

Understanding Technical Debt

To produce quality software it is really important to understand the concept of Technical Debt. Let’s see how Ward Cunningham explains it. The key phrase in his explanation is: “If you develop a program for a long period of time by only adding features but never reorganizing it to reflect your understanding of those features, then eventually that program simply does not contain any understanding and all efforts to work on it take longer and longer.” The fact is that every codebase changes. Every time we change something without fully understanding the impact of that change, we are duct-taping and in fact increasing our technical debt. A general sense of confusion builds up over time that makes it harder and harder for a developer to apply a change. You’ll end up with two possible reactions: either the developer gives up and decides to add some extra duct tape, or demands a rewrite to get rid of the mess. The choice is yours… So be warned: if your developers are confused, it…

Azure DevOps–dotnet pack gives wrong suggestion in ‘Path to csproj or nuspec file(s) to pack’

One of the available pipeline tasks in Azure DevOps is the dotnet core CLI task. Through this task you can execute the available command line options that dotnet offers. One of those is ‘dotnet pack’, which allows you to create NuGet packages from either csproj or nuspec files. You can specify where the task should search for these files by configuring the ‘Path to csproj or nuspec file(s) to pack’ option. However, I noticed that this option didn’t work as expected: I tried to exclude some files but that didn’t work. It turned out that the suggestion next to the option was wrong. The documentation mentions that you should use ‘-:’ to exclude files and folders. This is wrong; instead you should use a ‘!’:

Wrong pattern: **/*.csproj;-:**/*.Tests.csproj
Correct pattern: **/*.csproj;!**/*.Tests.csproj

ASP.NET Core–Taghelper doesn’t work in ViewComponent

Inside my ASP.NET Core application I created a custom ViewComponent that could be used to log out a user. To make my life a little bit easier I used an ASP.NET Core TagHelper inside my component. Unfortunately this didn’t work, and instead of parsing the custom tags the TagHelpers were rendered ‘as-is’ in the HTML output. To fix it I had to explicitly include the TagHelpers inside my ViewComponent using @addTagHelper:

@using ViewComponents.Logout
@addTagHelper *, Microsoft.AspNetCore.Mvc.TagHelpers

Visual Studio - No test matches the given testcase filter

When trying to review some code from a colleague, I first tried to run the available unit tests in Visual Studio. The tests were successfully discovered, but when I tried to execute them nothing happened. In the test output window I noticed the following message: No test matches the given testcase filter. This error made me think that somewhere a testcase filter was set. But where? I started searching through the available settings and config files but couldn’t find anything special. The error message misled me: the problem was not related to any kind of testcase filter. It turned out to be caused by the fact that the NUnit Test Adapter wasn’t installed. I opened the NuGet Package Manager and installed NUnit3TestAdapter.
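If you prefer the command line, the same fix is one command away in the test project:

    dotnet add package NUnit3TestAdapter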

C# 8–Default!

After enabling ‘Nullable reference types’ in a C# project, I got a lot of the following warning: CS8618 - Non-nullable property '<propertyname>' must contain a non-null value when exiting constructor. Consider declaring the property as nullable. I got this message mostly on my ViewModels. I could follow the compiler’s suggestion and mark the property as nullable, but I know that the ViewModel is populated from the database and that the provided value can never be ‘null’. In that case the trick to get rid of the warning is to use a property assignment with a default literal:
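A sketch of what that looks like on a ViewModel (the property names are illustrative):

    public class ProductViewModel
    {
        // default! combines the default literal with the null-forgiving
        // operator: "trust me, this will be populated before use".
        public string Name { get; set; } = default!;
        public string Description { get; set; } = default!;
    }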

SQL Server Management Studio - Template explorer

I’ve been using SQL Server Management Studio for a long time, but it wasn’t until recently that I discovered it contains a feature called Template Explorer. What is it? Through Template Explorer you get access to a large list of scripts useful for a lot of scenarios. Next time you want to search for how to execute a specific task, take a look at the Template Explorer first; there is a good chance that a script is already available. Where can I find it? To locate the templates in SQL Server Management Studio, go to View, click Template Explorer, and then select SQL Server Templates. More information: https://docs.microsoft.com/en-us/sql/ssms/template/template-explorer

Azure DevOps Server- Upgrade ElasticSearch

After upgrading your Azure DevOps Server instance, you should also upgrade your ElasticSearch instance. If the instance is running on the same server as Azure DevOps, it can be automatically updated using the Configure Search feature in the Azure DevOps Administration console. In case you are running ElasticSearch on a different machine (which I would recommend), you need to copy the Search service package to the target server. You can find the link in the wizard that is used to Configure Search. On the remote server, extract the installer files and execute the following command: .\Configure-TFSSearch.ps1 --operation update. The installer should be able to find the existing ElasticSearch instance and update it for you. More information: https://docs.microsoft.com/en-us/azure/devops/project/search/administration

Snapper - A snapshot does not exist

The first time I ran Snapper it failed with the following error message:

    Snapper.Exceptions.SnapshotDoesNotExistException : A snapshot does not exist.
    Apply the [UpdateSnapshots] attribute on the test method or class and then run the test again to create a snapshot.

    Stack Trace:
      SnapshotAsserter.AssertSnapshot(SnapResult snapResult)
      Snapper.MatchSnapshot(Object snapshot, SnapshotId snapshotId)
      Snapper.MatchSnapshot(Object snapshot, String childSnapshotName)
      Snapper.MatchSnapshot(Object snapshot)
      SnapperExtensions.ShouldMatchSnapshot(Object snapshot)
      GraphQLTests.Queries_Succeed(String queryName) line 57
      --- End of stack trace from previous location where exception was thrown ---

This error makes sense as Snapper expects that a snapshot exists when you run your tests. To fix this, you need to update your tests and add the [UpdateSnapshots] attribute above your test method or class. When Snapper finds th…
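A sketch of the fix (the test body is illustrative; remember to remove the attribute again after the snapshots are created, otherwise they are rewritten on every run and the test always passes):

    using Snapper;
    using Snapper.Attributes;
    using Xunit;

    public class GraphQLTests
    {
        [UpdateSnapshots] // creates/updates the stored snapshot on the next run
        [Fact]
        public void Queries_Succeed()
        {
            var result = new { Name = "Product 1", Price = 10 }; // illustrative result
            result.ShouldMatchSnapshot();
        }
    }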

Using XUnit Theory with Snapper

To test my GraphQL schema I first used SnapShooter. From the documentation: “Snapshooter is a flexible .Net testing tool, which can be used to validate all your test results with one single assert. It creates simply a snapshot of your test result object and stores it on the file system. If the test is executed again, the test result will be compared with the stored snapshot. If the snapshot matches to the test result, the test will pass.” It is really a good fit to test your GraphQL API. Unfortunately I got into trouble when I tried to combine it with XUnit Theory. I had created one test that loaded multiple GraphQL files and validated them. This didn’t work as expected because the same snapshot is used for every test run; as a consequence it fails when the same test runs again with a different parameter value. Probably there is a way to fix it, but I was lazy, so I took a look online and found Snapper, which offers similar features to SnapShooter but supports the XUnit Theory scenario.
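With Snapper, one way to make a Theory work is to store a child snapshot per parameter value. A sketch (the query-executing helper is hypothetical):

    using System.Threading.Tasks;
    using Snapper;
    using Xunit;

    public class GraphQLTests
    {
        [Theory]
        [InlineData("productById")]
        [InlineData("productByName")]
        public async Task Queries_Succeed(string queryName)
        {
            var result = await ExecuteGraphQLQueryAsync(queryName);
            // One child snapshot per theory parameter, so runs don't overwrite each other.
            result.ShouldMatchChildSnapshot(queryName);
        }

        // Stub standing in for the real query execution against the schema.
        private static Task<object> ExecuteGraphQLQueryAsync(string queryName)
            => Task.FromResult<object>(new { Query = queryName });
    }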

Enable .NET 5 in your Azure App Service

After deploying a .NET 5 application in an Azure App Service, I got the following error message when I tried to run the application: HTTP Error 500.31 - ANCM Failed to Find Native Dependencies. Common solutions to this issue: the specified version of Microsoft.NetCore.App or Microsoft.AspNetCore.App was not found. This is because .NET 5 is not enabled by default (yet) in Azure App Service. Let’s see how to fix this:

- Open the Azure portal
- Go to the App Service you want to configure
- Click on Configuration in the Settings section
- Go to the Stack Settings and change the .NET Framework version to .NET 5 (Early Access)
- Click Save
- Restart the App Service

.NET 5–Source Generators–Lessons Learned–Part 3

One of the new features in .NET 5 that triggered my curiosity were source generators. I did some investigation on how to use them and want to share the lessons I learned along the way. Yesterday I finally got my first source generator up and running. Now it is time to move on and do something more interesting. I found an example on GitHub that created a strongly typed config class based on your appsettings.json. I tried to duplicate the code, but when I built my application the source generator didn’t run. In the build output I noticed the following warning: Warning CS8034: Unable to load Analyzer assembly c:\lib\netstandard2.0\Syst…

.NET 5–Source generators–Lessons learned–Part 2

One of the new features in .NET 5 that triggered my curiosity were source generators. I did some investigation on how to use them and want to share the lessons I learned along the way. Yesterday I wrote my first generator. Unfortunately nothing happened and no new code was generated. I tried to attach the debugger, but when I did a rebuild still nothing happened; it seems that the code itself was never called. It took me some time to figure out why it didn’t work: it turns out that you have to add your source generator to a separate assembly (which makes sense). In my first try I just added the source generator logic to the same project as the other code, but this doesn’t work. How to package a source generator? To get source generators up and running, you need to create a separate project and add the source generator logic there. So create a new .NET Standard project and add a reference to the Roslyn package (see the fragment below). Generators can be packaged using the same method as an Analyzer…
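A sketch of the relevant csproj fragments (project names are illustrative; the package version reflects the .NET 5 era):

    <!-- In the generator project: -->
    <ItemGroup>
      <PackageReference Include="Microsoft.CodeAnalysis.CSharp" Version="3.8.0" PrivateAssets="all" />
    </ItemGroup>

    <!-- In the consuming project: reference the generator as an analyzer,
         not as an ordinary assembly reference. -->
    <ItemGroup>
      <ProjectReference Include="..\MyGenerators\MyGenerators.csproj"
                        OutputItemType="Analyzer"
                        ReferenceOutputAssembly="false" />
    </ItemGroup>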

.NET 5–Source generators–Lessons learned - Part 1

One of the new features in .NET 5 that triggered my curiosity were source generators. I did some investigation on how to use them and want to share the lessons I learned along the way. But let’s first start with an explanation of what “a source generator” actually is: A Source Generator is a new kind of component that C# developers can write that lets you do two major things: 1) Retrieve a Compilation object that represents all user code that is being compiled. This object can be inspected and you can write code that works with the syntax and semantic models for the code being compiled, just like with analyzers today. 2) Generate C# source files that can be added to a Compilation object during the course of compilation. In other words, you can provide additional source code as input to a compilation while the code is being compiled. So in short, source generators are a new compiler feature that allows you to inspect existing code and generate new code (remark: you can only add new code, existing code cannot be modified).
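To make that concrete, here is a minimal, purely illustrative generator that only adds one new source file to the compilation:

    using System.Text;
    using Microsoft.CodeAnalysis;
    using Microsoft.CodeAnalysis.Text;

    [Generator]
    public class HelloGenerator : ISourceGenerator
    {
        public void Initialize(GeneratorInitializationContext context)
        {
            // No syntax receiver needed for this trivial example.
        }

        public void Execute(GeneratorExecutionContext context)
        {
            // Adds a new source file to the compilation; existing code is untouched.
            context.AddSource("Hello.g.cs", SourceText.From(
                "namespace Generated { public static class Hello { public const string World = \"world\"; } }",
                Encoding.UTF8));
        }
    }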

GraphQL Hot Chocolate–Enable global authentication

There are multiple ways to enable authentication in Hot Chocolate. Here is a simple approach. Step 1 – Enable ASP.NET Core authentication: the first step is to enable authentication at the ASP.NET Core level. Let’s use a JWT token for authentication. Step 2 – Enable authentication at the root GraphQL query: the second (and already the last) step is to enable authentication on the root query type. By providing no role or policy names we’re simply saying the user must be authenticated.
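The excerpt drops the code, so here is a sketch against the Hot Chocolate 10-era API (type names are illustrative; newer Hot Chocolate versions use a different configuration style):

    using HotChocolate;
    using HotChocolate.Types;
    using HotChocolate.AspNetCore.Authorization;
    using Microsoft.AspNetCore.Authentication.JwtBearer;
    using Microsoft.Extensions.DependencyInjection;

    public class Startup
    {
        public void ConfigureServices(IServiceCollection services)
        {
            // Step 1: ASP.NET Core authentication with JWT bearer tokens.
            services.AddAuthentication(JwtBearerDefaults.AuthenticationScheme)
                .AddJwtBearer(options => { /* Authority, Audience, ... */ });

            // Step 2: a schema with authorization support.
            services.AddGraphQL(sp => SchemaBuilder.New()
                .AddServices(sp)
                .AddQueryType<QueryType>()
                .AddAuthorizeDirectiveType()
                .Create());
        }
    }

    public class QueryType : ObjectType<Query>
    {
        protected override void Configure(IObjectTypeDescriptor<Query> descriptor)
        {
            // No role or policy names: the caller only has to be authenticated.
            descriptor.Authorize();
        }
    }

    public class Query
    {
        public string Hello() => "world";
    }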

Visual Studio 2019 16.8– Git Amend

With the release of Visual Studio 2019 16.8, Git became the default source control experience. It no longer lives inside Team Explorer but got its own menu item; read the announcement for more details. One of the things I noticed when using the new Git experience was the first-class support for ‘git amend’. But what does ‘git amend’ do? Let me explain… The git commit --amend command is a convenient way to modify the most recent commit. It lets you combine staged changes with the previous commit instead of creating an entirely new commit. Amending does not just alter the most recent commit, it replaces it entirely, meaning the amended commit will be a new entity with its own ref. It is a way to rewrite the git history. Warning: avoid amending a commit that other developers have based their work on. To be safe, only amend local commits.
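For example, to add a forgotten file to the last commit while keeping its message (the file name is hypothetical):

    git add Forgotten.cs
    git commit --amend --no-edit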

Predefined type 'System.Runtime.CompilerServices.IsExternalInit' is not defined or imported

With the release of .NET 5 it was finally time to try record types. Here is my first attempt, copied from the documentation. Unfortunately this turned out not to be the success I was hoping for. Visual Studio returned the following error message: Error CS0518: Predefined type 'System.Runtime.CompilerServices.IsExternalInit' is not defined or imported (NHibernate.Tests, C:\NHibernate.Tests\Domain\Product.cs, line 17). The error is not very descriptive, but the reason I got it is that I forgot to update the project to .NET 5. The fix was easy: update the target framework in the csproj file.
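For completeness, a sketch of the relevant csproj change:

    <PropertyGroup>
      <TargetFramework>net5.0</TargetFramework>
    </PropertyGroup>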

CS8632 - The annotation for nullable reference types should only be used in code within a ‘#nullable’ annotations context.

I copy/pasted the following code from an example I found on the web to test source generators (but that is not what this post is about today). Building this code resulted in the following warning. Do you have any idea why? Take a look at the ‘Name’ property and notice the usage of ‘?’. This is part of the introduction of nullable reference types and declares the string as nullable. Nullable reference types are available beginning with C# 8.0, in code that has opted in to a nullable aware context. This is what the warning message is referring to. We need to enable nullable reference type support in our project. This can be done by adding <Nullable>enable</Nullable> to our csproj file:
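A trimmed-down, illustrative version of the offending code:

    public class Person
    {
        // '?' marks Name as a nullable reference type; without a nullable
        // annotation context (project-wide <Nullable>enable</Nullable> or a
        // #nullable enable directive) this line triggers warning CS8632.
        public string? Name { get; set; }
    }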

The target framework 'netcoreapp3.0' is out of support and will not receive security updates in the future.

After upgrading to the latest Visual Studio version (16.8.1) that added support for .NET 5, I got a new warning when opening an existing .NET Core 3.0 solution: Warning NETSDK1138: The target framework 'netcoreapp3.0' is out of support and will not receive security updates in the future. Please refer to https://aka.ms/dotnet-core-support for more information about the support policy. You don’t have to panic when you see this warning. The only thing you need to be…

ORM- Attaching detached object trees

One of the nice features of an ORM is that it allows you to change a complete object tree and the ORM will figure out what needs to be added, updated and deleted. The reason this works is that a typical ORM will, by default, track all objects that are attached to the ORM abstraction (DbContext in EF Core, ISession in NHibernate). Because it tracks these objects (and their relations) it knows what should happen when you make a change. Unfortunately, when you are sending objects to a client through an API (typically using DTOs and tools like AutoMapper) and you get a changed object tree back, the objects in this tree are no longer attached to the ORM, so it doesn’t know what should be inserted, updated or deleted. Luckily most ORMs are smart and use different tricks to figure out what has changed. However, one place where these ORMs typically fail is when you are using AutoMapper and mapping a child collection in your DTOs to your domain model. An example below where we…
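The post’s own example is cut off above; here is a sketch of the failure mode (EF Core flavoured, names illustrative). Libraries such as AutoMapper.Collection exist to address exactly this:

    // Load the tracked object tree:
    var order = await dbContext.Orders
        .Include(o => o.Lines)
        .SingleAsync(o => o.Id == orderDto.Id);

    // Maps the DTO onto the tracked entity, but the Lines collection is
    // replaced with brand-new, untracked OrderLine instances:
    mapper.Map(orderDto, order);

    // The change tracker now sees deletes + inserts (or fails) instead of updates.
    await dbContext.SaveChangesAsync();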

Microsoft Orleans- Warning NoOpHostEnvironmentStatistics

When starting up my Orleans cluster I noticed the following warning message:

    warn: Orleans.Runtime.NoOpHostEnvironmentStatistics[100708]
          No implementation of IHostEnvironmentStatistics was found. Load shedding will not work yet

To get rid of this warning you need to include the following line in the SiloBuilder config. This will enable support for load shedding, which kicks in when the system gets overloaded and prevents new clients from connecting.
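The excerpt omits the actual line; on Windows, with the Microsoft.Orleans.OrleansTelemetryConsumers.Counters package, it looks roughly like this (on Linux, UseLinuxEnvironmentStatistics from the Linux telemetry package plays the same role):

    var host = new HostBuilder()
        .UseOrleans(siloBuilder =>
        {
            // Registers an IHostEnvironmentStatistics implementation,
            // which enables load shedding.
            siloBuilder.UsePerfCounterEnvironmentStatistics();
        })
        .Build();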

Kafka - Avro - Value serialization error

The default serialization format for Kafka is Avro. I mentioned how to use this in C# yesterday. Today I got into trouble when trying to send a specific message. I changed the example a little bit to explicitly point out the issue: in the code above I set the value for ‘Browser’ to ‘null’. When trying to send this message it failed with the following error: Local: Value serialization error. Let’s have a look at the related Avro schema. The problem is that the schema specifies that the Browser field should have a value of type string; ‘null’ is not a valid value for string, which explains why it fails. To solve this I have two options: 1) change the code to send an empty string instead of null, or 2) update the schema to allow null values for the Browser field, as sketched below. More about Avro: https://en.wikipedia.org/wiki/Apache_Avro
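For illustration, option 2 changes the field in the .avsc file to a union type that allows null (a sketch, reusing the field name from the example):

    { "name": "Browser", "type": ["null", "string"], "default": null }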

Kafka- Using Avro as serialization format in C#

To help you with using Avro as the serialization format for your Kafka messages, a .NET Core global tool, avrogen, is available. First install the tool using dotnet tool install, then specify your message schema by creating an .avsc file with your message specification, and generate the necessary code; the commands are shown below. The generated type can then be used by your Producer and Consumer logic. More information: https://www.confluent.io/blog/avro-kafka-data/
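For reference, the commands look roughly like this (the tool id is the Confluent fork that was current at the time of the linked post; the schema file name is illustrative):

    dotnet tool install --global Confluent.Apache.Avro.AvroGen
    avrogen -s PageViewEvent.avsc .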

DDD–Strongly typed Ids-Using C#9 Record Types

I blogged before about the usage of strongly typed Ids in your domain models. With the introduction of C# 9 record types, an alternative approach becomes an option. Record types are reference types with built-in immutability and value semantics. They automatically provide implementations for Equals, GetHashCode, etc., and offer a very concise syntax known as positional records. This allows us to rewrite our ProductId type using records:

public record ProductId(int Value);

That’s all that is needed. Thank you C# 9!

Sending a message through Kafka - Value serializer not specified and there is no default serializer defined for type

My first attempt to send a typed message through Kafka resulted in the following error message: Value cannot be null. (Parameter 'Value serializer not specified and there is no default serializer defined for type PageViewEvent'). Here is the code I was using. As the error message mentions, you need to explicitly specify which serializer should be used for your message object. Therefore you need to use the SchemaRegistryClient and specify a serializer (I’m using Avro in the sample below):
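A sketch of the corrected producer setup (broker/registry addresses, topic and key are illustrative; AvroSerializer comes from Confluent.SchemaRegistry.Serdes):

    using var schemaRegistry = new CachedSchemaRegistryClient(
        new SchemaRegistryConfig { Url = "http://localhost:8081" });

    using var producer = new ProducerBuilder<string, PageViewEvent>(
            new ProducerConfig { BootstrapServers = "localhost:9092" })
        // Explicitly register an Avro serializer for the message value:
        .SetValueSerializer(new AvroSerializer<PageViewEvent>(schemaRegistry))
        .Build();

    await producer.ProduceAsync("page-views",
        new Message<string, PageViewEvent> { Key = "user-1", Value = pageViewEvent });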

Configure a Kafka topic in C#

By default, when you use the Confluent Kafka .NET client, a topic is created automatically for you when you publish your first message. However, this creates the topic using the default settings. Typically you want more control when creating a topic. This is possible through the usage of the AdminClient:
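A sketch using Confluent.Kafka’s AdminClient (server address, topic name and sizing are illustrative):

    using Confluent.Kafka;
    using Confluent.Kafka.Admin;

    using var adminClient = new AdminClientBuilder(
        new AdminClientConfig { BootstrapServers = "localhost:9092" }).Build();

    // Create the topic up front with explicit settings:
    await adminClient.CreateTopicsAsync(new[]
    {
        new TopicSpecification
        {
            Name = "page-views",
            NumPartitions = 6,
            ReplicationFactor = 3
        }
    });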

Azure DevOps Docker build error–No such file or directory

When trying to build a new .NET Core project using Azure DevOps, the build failed with the following error message:

    Step 6/16 : COPY ["DockerWebApplication1/DockerWebApplication1.csproj", "DockerWebApplication1/"]
    COPY failed: stat /var/lib/docker/tmp/docker-builder590338138/DockerWebApplication1/DockerWebApplication1.csproj: no such file or directory
    ##[error]COPY failed: stat /var/lib/docker/tmp/docker-builder590338138/DockerWebApplication1/DockerWebApplication1.csproj: no such file or directory

The problem is that the COPY command is using the wrong folder to copy from. To fix it, update your Docker Build task:

- Uncheck the “Use Default Build Context” checkbox
- Set the “Build Context” field to "$(System.DefaultWorkingDirectory)"

Software development is learning

In every software project you have two sides: the problem space and the solution space. The problem space is typically owned by a product owner; he or she knows the problem they want to solve. The solution space is owned by the developers; they know how to solve those problems. Easy, right? Unfortunately it isn’t as easy as it sounds. Product owners don’t always succeed in explaining the problem (or describe a workaround instead of the real problem). As a result, developers don’t always get it right the first time and build the wrong solution. The most important lesson you can take with you on any software project is that it is all about learning. As you are learning, nobody expects you to get it right the first time. Instead somebody points out your mistakes and gives you feedback. So how can we become better at writing software? Not by typing faster and certainly not by making no mistakes. We become better by having better feedback faster. The key is shorter feedback loops; t…

Domain Driven Design–Structuring your applications

There are a lot of options out there on how you should structure your DDD projects. Although DDD gives you a clear explanation of the different building blocks, it doesn’t prescribe any particular project setup or structure. So this leaves it up to you (and everyone else) to create a structure that works for you. A possible setup I used before can be found below. It is inspired by the following blog articles:

- Applied Domain-Driven Design (DDD), Part 1 - Basics
- Applied Domain-Driven Design (DDD), Part 2 - Domain events
- Applied Domain-Driven Design (DDD), Part 3 - Specification Pattern
- Applied Domain-Driven Design (DDD), Part 4 - Infrastructure
- Applied Domain-Driven Design (DDD), Part 5 - Domain Service
- Applied Domain-Driven Design (DDD), Part 6 - Application Services
- Applied Domain-Driven Design (DDD), Part 7 - Read Model Services
- Applied Domain-Driven Design (DDD), My Top 5 Best Practices
- Applied Domain-Driven Design (DDD), Event Logging & Sourcing For Auditing

HotChocolate - Apollo Tracing

To investigate what’s going on inside your GraphQL backend, I recommend enabling Apollo tracing. You can leave it always on, but as it has a (small) performance impact, a better way is to enable it on demand. In Hot Chocolate you can specify this through the QueryExecutionOptions. After doing that, tracing will be activated when a GraphQL-Tracing HTTP header with a value of 1 is added to the request. Now your GraphQL result will not only contain the data but also tracing information. GraphQL tools like GraphQL Playground can visualize this data. If the tracing information is not shown, check your GraphQL Playground settings and see if "tracing.hideTracingResponse" is set to false.
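The excerpt omits the snippet; against the Hot Chocolate 10-era API it looks roughly like this (the exact overload shape is an assumption from memory):

    services.AddGraphQL(
        sp => SchemaBuilder.New()
            .AddServices(sp)
            .AddQueryType<QueryType>()
            .Create(),
        new QueryExecutionOptions
        {
            // Trace only when the request carries a GraphQL-Tracing: 1 header.
            TracingPreference = TracingPreference.OnDemand
        });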

SonarCloud–New Code Definition

When scanning a project for the first time through SonarCloud in your Azure DevOps build pipeline, a new SonarCloud project is created automatically. The second time your build runs, a comparison is done and you’ll have your first Quality Gate results. At least, that is the theory… When we tried to do this for a new project, the corresponding SonarCloud project was created, but we got the following warning when browsing to this project in SonarCloud: there is a new requirement for every SonarCloud project to specify how ‘new code’ is identified. You can choose between the following options:

- Previous version: all code that has changed since the previous version
- Number of days: all code that has changed in the last x days
- Specific date: all code that has changed since the specified date

It can be set in the organization settings (Organization –> Administration –> New Code) and applies to all new projects. Unfortunately setting this didn’t resolve the is…

Learning Docker in a month

If you are new to Docker, I can recommend the Learn Docker in a Month of Lunches YouTube series. It will cost you your lunch break but that is a small cost to pay…

Azure DevOps - Allow users to edit process templates

I got the following question from a colleague: “Is it possible to give all team members the necessary rights to edit a specific process template?” Let’s see how we can do this:

- Go to the root of your Azure DevOps organization (e.g. https://dev.azure.com/{organizationname}/)
- Click on Organization settings in the bottom left corner
- Scroll to the Process section on the left and click on it
- Click on the … next to the process template you want to use and click on the Security item in the context menu
- Search for the team you want to add; after adding it, change the Edit process permission to “Allow”

That’s it!

Entity Framework Core –Pessimistic concurrency

After blogging about pessimistic concurrency and the usage of locks in NHibernate yesterday, I wanted to write a follow-up on how to do the same thing in Entity Framework Core. Turns out that EF Core does not support pessimistic concurrency… However, this does not mean it is not possible. We’ll have to throw some raw SQL into the mix to get it working. First we need to explicitly create a database transaction. After doing that we need to take the lock; we do this by creating a ‘fake’ update statement with a lock hint. Now we can execute our ‘real’ update using EF Core. Here is the full code example:
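The excerpt drops the snippets, so here is the approach pieced together as a sketch (SQL Server lock hints; the entity, table and variable names are illustrative):

    await using var tx = await dbContext.Database.BeginTransactionAsync();

    // 'Fake' statement whose only job is to take and hold the row lock
    // for the lifetime of the transaction:
    await dbContext.Database.ExecuteSqlInterpolatedAsync(
        $"SELECT 1 FROM Products WITH (UPDLOCK, HOLDLOCK) WHERE Id = {productId}");

    // The 'real' update through EF Core:
    var product = await dbContext.Products.SingleAsync(p => p.Id == productId);
    product.Price = newPrice;
    await dbContext.SaveChangesAsync();

    await tx.CommitAsync();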

DDD–Strongly typed Ids

One of the core principles of DDD is the usage of value objects to avoid “primitive obsession”. “Primitives” refer to the built-in types in C#: int, string, Guid, etc. “Primitive obsession” refers to over-using these types to represent domain concepts that aren’t a perfect fit. Some examples are a HouseNumber represented by an int or an EmailAddress represented by a string. This concept not only makes sense for your value objects but is also valuable for your Entity and AggregateRoot Ids: a ProductId should not be interchangeable with an OrderId. Creating a value object for every Id type is not that hard but remains cumbersome. Let’s introduce StronglyTypedId as a solution. From the website: “StronglyTypedId makes creating strongly-typed IDs as easy as adding an attribute! No more accidentally passing arguments in the wrong order to methods - StronglyTypedId uses Roslyn-powered build-time code generation to generate the boilerplate…”
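Usage, going by the library’s README, is a one-liner (Guid is the default backing type; treat the details as a sketch):

    [StronglyTypedId] // generates constructor, Value property, equality, ToString, ...
    public partial struct ProductId { }

    // No accidental mixing of ids anymore:
    var productId = new ProductId(Guid.NewGuid());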

NHibernate–Pessimistic concurrency

In most cases locking is something you want to avoid as it limits the level of concurrency of your database. But sometimes it is exactly what you want. You can use pessimistic concurrency in NHibernate by using an overload that takes a LockMode, both when using session.Get<T> and when using session.Query<T>:
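A sketch of the Get<T> overload in action (entity and variable names are illustrative):

    using (var tx = session.BeginTransaction())
    {
        // Issues a SELECT ... FOR UPDATE (exact SQL is database dependent):
        var product = session.Get<Product>(productId, LockMode.Upgrade);

        product.Price = newPrice;
        tx.Commit();
    }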

Microsoft MakeCode

If you want to teach your children the joy of programming, have a look at Microsoft MakeCode. From the website: “Microsoft MakeCode is a free, open source platform for creating engaging computer science learning experiences that support a progression path into real-world programming.” It contains a lot of simulators, tutorials and examples for multiple devices and platforms.

Azure DevOps– Lead time and cycle time

Both lead time and cycle time can be measured and visualized in Azure DevOps. Lead time is calculated from work item creation to entering a completed state. Cycle time is calculated from first entering an In Progress state to entering a Completed state. The easiest way to get this data is by using the Cycle Time and Lead Time widgets. To enable these widgets:

- Go to the Dashboards page in your Azure DevOps project and click on Edit on the right.
- In the Add Widget search box, enter ‘cycle’ to search for the Cycle Time widget. Click on Add to add it to the dashboard.
- On the Configuration page, select the team, backlog level and time period. You also have the option to further filter the backlog using field criteria.
- Click on Save and Close to apply the configuration.
- Repeat the steps above for the Lead Time widget.

More information: https://docs.microsoft.com/en-us/azure/devops/report/dashboards/cycle-time-an…

HotChocolate GraphQL - Integration test authorization

The HotChocolate blog gives some guidance on how to write integration tests. This is a good starting point, but it doesn’t get you to a final solution when you are using authorization in your GraphQL schema. In that case you need a way to inject an authenticated ClaimsPrincipal into the GraphQL middleware. It took us some time to figure out a solution, but there are two steps involved: create a custom middleware and register it.
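A sketch of the idea: a middleware that unconditionally attaches an authenticated ClaimsPrincipal for test runs (claim values and the middleware name are illustrative):

    using System.Security.Claims;
    using System.Threading.Tasks;
    using Microsoft.AspNetCore.Http;

    public class TestAuthenticationMiddleware
    {
        private readonly RequestDelegate _next;
        public TestAuthenticationMiddleware(RequestDelegate next) => _next = next;

        public async Task InvokeAsync(HttpContext context)
        {
            var identity = new ClaimsIdentity(new[]
            {
                new Claim(ClaimTypes.Name, "integration-test-user"),
                new Claim(ClaimTypes.Role, "admin")
            }, authenticationType: "Test");

            // Every request is now 'authenticated' before it reaches GraphQL.
            context.User = new ClaimsPrincipal(identity);
            await _next(context);
        }
    }

    // Registration, before the GraphQL middleware in the test host:
    // app.UseMiddleware<TestAuthenticationMiddleware>();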

GraphQL Altair–Test and debug your GraphQL queries

As the GraphQL ecosystem keeps growing, I keep finding new tools that make my job as a GraphQL developer easier. Recently I started using Altair, a GraphQL client application that makes it easy to test your GraphQL endpoints. A selection of the features it has to offer:

- Multi language/platform/window
- Import and export queries
- File upload
- Syntax and query highlighting
- Autofill all fields
- …

Quick tip if you want to play around with C# 9

The easiest way to start playing with C# 9 is by installing LinqPad 6. This allows you to play with the new language features without installing any extra dependencies. To enable the new language features, go to Edit –> Preferences, open the Query tab, and tick the checkbox next to Enable C# 9 preview features.

C#9- Record types

One of the most anticipated features coming in C# 9 is record types. Record types make it easy to create immutable reference types in .NET. They are the perfect fit for Value Objects in DDD terms. A new keyword is introduced to support this: record. The record keyword makes an object immutable and behave like a value type. You can even write this using a shorter syntax: positional records. But what if you want to change the person’s name? How can we do this knowing that the object is immutable? Do I need to copy all properties into a new object? Let’s introduce the with keyword to fix this. It allows you to create an object from another by specifying which properties change:
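Illustrative examples of the three pieces of syntax described above (the Person type is an assumption, not the post’s exact sample):

    // A record with init-only properties:
    public record Person
    {
        public string FirstName { get; init; }
        public string LastName { get; init; }
    }

    // The same thing as a positional record:
    public record Person2(string FirstName, string LastName);

    // Non-destructive mutation with 'with':
    var jane = new Person2("Jane", "Doe");
    var john = jane with { FirstName = "John" };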

C#9–Covariant return types

C# 9 adds a whole list of new language features. Most people talk about the support for record types and improved pattern matching, but one of the features I’m happy is finally added to the language is covariant return types. You are probably asking: covariant return what????? Let’s explain. With return type covariance, you can override a base class method that has a less-specific return type with one that returns a more specific type. Before C# 9 you would have to return the base type from the inherited class; in C# 9, you can do the following:
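A small illustrative example (the Animal/Dog pair is an assumption, not the post’s sample):

    public abstract class Animal
    {
        public abstract Animal Clone();
    }

    public class Dog : Animal
    {
        // C# 9 covariant return type: the override returns Dog, not Animal,
        // so callers no longer need to cast.
        public override Dog Clone() => new Dog();
    }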

RabbitMQ–Lazy queues

By default RabbitMQ tries to keep your whole queue in memory. This is OK as long as your messages are processed fast enough, but not if your queue becomes very long (many millions of messages). Queues can become very long for various reasons:

- consumers are offline / have crashed / are down for maintenance
- there is a sudden message ingress spike, producers are outpacing consumers
- consumers are slower than normal

Lazy queues can help in these situations: messages are moved to disk as early as practically possible, and are only loaded into RAM when requested by consumers. This comes at the cost of increased disk I/O. You can configure this in MassTransit when configuring your receive endpoint, as sketched below. More information: https://www.rabbitmq.com/lazy-queues.html
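A sketch of the MassTransit configuration (the endpoint name and consumer are illustrative):

    cfg.ReceiveEndpoint("order-events", e =>
    {
        // Declares the queue in lazy mode (x-queue-mode=lazy):
        e.Lazy = true;
        e.ConfigureConsumer<OrderConsumer>(context);
    });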

Database profiling in Visual Studio

Visual Studio 2019 version 16.3 extends the Performance Profiler with a new ‘Database’ option. This enables the new Database tool that captures all database activity during your profiling session. Remark: the Database tool works with .NET Core projects using either ADO.NET or Entity Framework Core. It also works on traces collected using dotnet trace, which means we can collect data anywhere .NET Core runs (including Linux!) and analyze that data in Visual Studio. Let’s try it:

- Open the Performance Profiler in Visual Studio by clicking Debug > Performance Profiler.
- Select the checkbox next to “Database” to enable the tool.
- Click on Start to start the profiling session.
- Now interact with your application in the ways you’re interested in investigating. When you are done, click ‘Stop collection’.

Now you get a table of all the queries that happened during your profiling session, along with a graph that shows when and how many queries happen ove…

Improve the startup time of your .NET Core application

Starting from .NET Core 3.x, tiered compilation is enabled by default. This allows you to use precompiled code from assemblies created using the ReadyToRun (R2R) format. R2R is a form of ahead-of-time (AOT) compilation: it improves startup performance by reducing the amount of work the just-in-time (JIT) compiler needs to do as your application loads. To use this feature you need to enable ReadyToRun and publish your application as a self-contained app.

Add the <PublishReadyToRun> setting to your project:

    <PropertyGroup>
      <PublishReadyToRun>true</PublishReadyToRun>
    </PropertyGroup>

Publish a self-contained app. Here is an example targeting the Linux ARM64 runtime:

    dotnet publish -c Release -r linux-arm64 --self-contained

.NET Core Plugin architecture

A plugin architecture remains a valuable option to make your application extensible. With the introduction of AssemblyLoadContext and AssemblyDependencyResolver in .NET Core 3, creating and loading plugins became a lot easier. Still, if you need more features I would recommend having a look at the DotNetCorePlugins project: https://github.com/natemcmaster/DotNetCorePlugins. Usage is simple through the PluginLoader class. One nice feature it adds to the mix is hot reloading, which allows you to dynamically update assemblies on the fly:
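A usage sketch based on the project’s README (the path, IPlugin contract and exact overload are assumptions to check against the current API):

    using System;
    using System.Linq;
    using McMaster.NETCore.Plugins;

    var loader = PluginLoader.CreateFromAssemblyFile(
        assemblyFile: "plugins/MyPlugin/MyPlugin.dll",   // illustrative path
        sharedTypes: new[] { typeof(IPlugin) },          // types shared with the host
        configure: config => config.EnableHotReload = true);

    var assembly = loader.LoadDefaultAssembly();
    var pluginType = assembly.GetTypes()
        .First(t => typeof(IPlugin).IsAssignableFrom(t) && !t.IsAbstract);
    var plugin = (IPlugin)Activator.CreateInstance(pluginType)!;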

Azure DevOps–Dependency Tracker

When working with multiple teams in Azure DevOps that have cross-team dependencies, it can become quite hard to track what is going on. If you recognize this problem, I would recommend having a look at the Dependency Tracker extension for Azure DevOps: https://marketplace.visualstudio.com/items?itemName=ms-eswm.dependencytracker&ssr=false#overview. The Dependency Tracker extension provides you with the ability to plan and manage dependencies across teams, projects, and even organizations. It provides filterable views to show all dependencies a team is consuming and producing, and there is even a timeline view available. The easiest way to manage your dependencies is to use the predecessor/successor link type to link work items together. More information: https://docs.microsoft.com/en-us/azure/devops/boards/extensions/dependency-tracker

Angular 10–Strict mode

With the release of Angular 10, a more strict project setup is available. You can enable this by adding a --strict flag when creating a new Angular project:

    ng new --strict

This strict mode enables more build-time optimizations, helps to catch bugs faster, and improves the overall maintainability of your Angular application. It applies the following changes:

- Enables strict mode in TypeScript
- Turns template type checking to Strict
- Default bundle budgets have been reduced by ~75%
- Configures linting rules to prevent declarations of type any
- Configures your app as side-effect free to enable more advanced tree-shaking

There is no out-of-the-box option to enable this for an existing project (after creating it without the --strict flag or after upgrading from a previous version), but you can apply the same changes manually. Add the following rules to your tsconfig.json: "compilerOptions": { "strict": true, "fo…

Computer stuff they didn’t teach you…

It is always fun to see Scott Hanselman in action, and in this video series he explains a lot of IT concepts in a simple and concise way. A good way to end your work week…

GraphQL–The query has non-mergable fields

I constructed the following query in GraphQL Playground, trying to combine two calls: one to fetch a product by id and another to fetch a product by name. However, when I tried to execute this query it resulted in the following error message: The query has non-mergable fields. The problem is that GraphQL tries to merge the results into one product result object, which does not work. To get the behavior I want, I need to specify an alias for every query. Now when I execute the query, the query results are put in their own objects:
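A sketch of what the aliased query could look like (field and argument names are illustrative, not the schema from the post):

    query {
      byId: product(id: 1) { name }
      byName: product(name: "Keyboard") { name }
    }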