
Posts

Showing posts from September, 2020

.NET Core Plugin architecture

A plugin architecture remains a valuable option for making your application extensible. With the introduction of AssemblyLoadContext and AssemblyDependencyResolver in .NET Core 3, creating and loading plugins became a lot easier. Still, if you need more features, I would recommend having a look at the DotNetCorePlugins project: https://github.com/natemcmaster/DotNetCorePlugins Usage is simple through the PluginLoader class. One nice feature it adds to the mix is hot reloading, which allows you to dynamically update assemblies on the fly.
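A minimal sketch of both features, assuming an IPlugin contract shared between host and plugin (the plugin path, the IPlugin interface and its Execute method are illustrative; check the project README for the exact options):

using System;
using McMaster.NETCore.Plugins;

// Load a plugin assembly; sharedTypes ensures host and plugin agree on the IPlugin contract.
var loader = PluginLoader.CreateFromAssemblyFile(
    "./plugins/MyPlugin/MyPlugin.dll",          // hypothetical plugin location
    sharedTypes: new[] { typeof(IPlugin) },
    config => config.EnableHotReload = true);   // turn on hot reloading

// Hot reloading: get notified when the assembly is updated on disk.
loader.Reloaded += (sender, args) => Console.WriteLine("Plugin was reloaded.");

foreach (var type in loader.LoadDefaultAssembly().GetTypes())
{
    if (typeof(IPlugin).IsAssignableFrom(type) && !type.IsAbstract)
    {
        var plugin = (IPlugin)Activator.CreateInstance(type);
        plugin.Execute();
    }
}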

Azure DevOps–Dependency Tracker

When working with multiple teams in Azure DevOps that have cross-team dependencies, it can become quite hard to track what is going on. If you recognize this problem, I would recommend having a look at the Dependency Tracker extension for Azure DevOps: https://marketplace.visualstudio.com/items?itemName=ms-eswm.dependencytracker&ssr=false#overview The Dependency Tracker extension gives you the ability to plan and manage dependencies across teams, projects, and even organizations. It provides filterable views that show all dependencies a team is consuming and producing, and there is even a timeline view available. The easiest way to manage your dependencies is to use the predecessor/successor link type to link work items together. More information: https://docs.microsoft.com/en-us/azure/devops/boards/extensions/dependency-tracker

Angular 10–Strict mode

With the release of Angular 10, a stricter project setup is available. You can enable it by adding the --strict flag when creating a new Angular project: ng new --strict This strict mode allows more build-time optimizations, helps to catch bugs faster and improves the overall maintainability of your Angular application. To achieve this it applies the following changes: enables strict mode in TypeScript; turns template type checking to Strict; reduces the default bundle budgets by ~75%; configures linting rules to prevent declarations of type any; and configures your app as side-effect free to enable more advanced tree-shaking. There is no out-of-the-box option to enable this for an existing project (after creating it without the --strict flag or after upgrading from a previous version), but you can apply the same changes manually. Start by adding the strict rules to your tsconfig.json, as shown below.
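A sketch of the relevant tsconfig.json entries, based on what a fresh Angular 10 --strict project generates (verify against your own generated project, as the exact flags vary by version):

{
  "compilerOptions": {
    "strict": true,
    "forceConsistentCasingInFileNames": true,
    "noImplicitReturns": true,
    "noFallthroughCasesInSwitch": true
  },
  "angularCompilerOptions": {
    "strictInjectionParameters": true,
    "strictTemplates": true
  }
}

The reduced bundle budgets and the side-effect configuration live in angular.json and a package.json file respectively.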

Computer stuff they didn’t teach you…

It is always fun to see Scott Hanselman in action, and in this video series he explains a lot of IT concepts in a simple and concise way. A good way to end your work week…

GraphQL–The query has non-mergable fields

I constructed the following query in GraphQL Playground. In this query I try to combine 2 calls: one to fetch a product by id and another one to fetch a product by name. However, when I tried to execute this query it resulted in the following error message: The query has non-mergable fields The problem is that GraphQL tries to merge the results into one product result object, which does not work. To get the behavior I want, I need to specify an alias for every query. Now when I execute the query, the query results are put in their own objects.
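To illustrate with a hypothetical product schema: without aliases the two product fields collide; with aliases each result gets its own key.

# Fails with "The query has non-mergable fields":
query {
  product(id: 1) { name price }
  product(name: "Keyboard") { name price }
}

# Works: each call is aliased to its own result object
query {
  productById: product(id: 1) { name price }
  productByName: product(name: "Keyboard") { name price }
}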

Error when running Sonar Cloud on Angular projects

And the story continues… After setting up Sonar Cloud integration in Azure DevOps and applying a fix for .NET Core applications, I tried to do the same thing for our Angular code. It didn’t work as expected; this is the output I got:

[More Information]( https://sonarcloud.io/documentation/analysis/scan/sonarscanner-for-azure-devops/ )
==============================================================================
D:\a\_tasks\SonarCloudPrepare_14d9cde6-c1da-4d55-aa01-2965cd301255\1.12.0\classic-sonar-scanner-msbuild\SonarScanner.MSBuild.exe end
SonarScanner for MSBuild 4.10
Using the .NET Framework version of the Scanner for MSBuild
Post-processing started.
18:04:47.123 Fetching code coverage report information from TFS...
18:04:47.125 Attempting to locate a test results (.trx) file...
18:04:47.64 Looking for TRX files in: D:\a\1\TestResults
18:04:47.64 No test results files found
18:04:48.125 Did not find any binary coverage files in th

Error when running Sonar Cloud on .NET Core projects

Yesterday I talked about setting up Sonar Cloud for code analysis. Unfortunately, it turned out not to be as simple as I expected. The first time I ran the build pipeline, the Code analysis task failed with the following error:

2020-09-14T17:47:52.6441174Z ##[section]Starting: Run Code Analysis
2020-09-14T17:47:52.6563714Z ==============================================================================
2020-09-14T17:47:52.6564026Z Task         : Run Code Analysis
2020-09-14T17:47:52.6564329Z Description  : Run scanner and upload the results to the SonarCloud server.
2020-09-14T17:47:52.6564583Z Version      : 1.15.0
2020-09-14T17:47:52.6564794Z Author       : sonarsource
2020-09-14T17:47:52.6565319Z Help         : Version: 1.15.0. This task is not needed for Maven and Gradle projects since the scanner should be run as part of the build. [More Information]( https://sonarcloud.io/documentation/analysis/scan/sonarscanner-for-azure-devops/ )
2020-09-14T1

Running code analysis through Sonar Cloud in Azure DevOps

Sonar Cloud is the SaaS version of SonarQube, a static code analyzer. It can inspect your code against a set of quality standards, detect bugs and security vulnerabilities, calculate technical debt and show how your code quality evolves over time. If you want to use it in Azure DevOps, you should first install the SonarCloud extension from the marketplace: https://marketplace.visualstudio.com/items?itemName=SonarSource.sonarcloud After the extension is installed you get 3 new build tasks. Remark: notice that these are not the same build tasks as the ones you should use for SonarQube(!) Let’s create a build pipeline that uses these tasks: First add the Prepare analysis on SonarCloud task. This task should be added to the beginning of your pipeline. In this task you should configure the SonarCloud service endpoint, specify an organization and set a project key and project name. This information will be used to create a new project inside SonarCloud. A last impo
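In YAML form, the three tasks could look roughly like this (the service connection name, organization and project values are placeholders):

steps:
- task: SonarCloudPrepare@1
  inputs:
    SonarCloud: 'SonarCloud'            # name of the service endpoint you configured
    organization: 'my-organization'
    scannerMode: 'MSBuild'
    projectKey: 'my-project-key'
    projectName: 'My Project'
# ... your build and test steps go here ...
- task: SonarCloudAnalyze@1
- task: SonarCloudPublish@1
  inputs:
    pollingTimeoutSec: '300'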

Building a producer/consumer pipeline in .NET Core using Open.ChannelExtensions

One of the lesser-known features of .NET Core is System.Threading.Channels. It allows you to implement a pipeline of producers/consumers without having to worry about locking, concurrency and so on. For an introduction, have a look here: https://devblogs.microsoft.com/dotnet/an-introduction-to-system-threading-channels/ Although it would be a good solution for a lot of use cases, I don’t see it used that often. I think the main reason is that the API is not that intuitive and it takes some time to figure out how to use it. Let’s have a look at an example (I borrowed it from Sacha Barber’s great introduction to System.Threading.Channels). Although this example is rather trivial, it takes some time to wrap your head around it and understand what is going on. Let’s see if we can simplify this example through Open.ChannelExtensions. This library offers a set of extensions for optimizing/simplifying System.Threading.Channels usage. Here is the simplified code:
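A minimal sketch of what such a pipeline can look like with Open.ChannelExtensions (the stages and numbers are illustrative, not the exact code from the post):

using System;
using System.Linq;
using System.Threading.Tasks;
using Open.ChannelExtensions;

public static class PipelineExample
{
    public static async Task RunAsync()
    {
        // Producer: write a range of numbers into a bounded channel.
        // PipeAsync: transform each item asynchronously.
        // ReadAll: consume everything and return the item count.
        long processed = await Channel
            .CreateBounded<int>(capacity: 100)
            .Source(Enumerable.Range(0, 1000))
            .PipeAsync(async i =>
            {
                await Task.Delay(10);   // simulate async work
                return i * 2;
            })
            .ReadAll(i => Console.WriteLine(i));

        Console.WriteLine($"Processed {processed} items.");
    }
}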

MassTransit–Reading a header using middleware

After blogging about adding a header yesterday, today let’s have a look at how we can read out this header in a generic way (if you only want to do this for a specific consumer, take a look at this blog post from last week). We’ll start by creating the filter. Next we need to find a way to register this filter generically so that it is applied for every consumer. The trick is to create an IConsumerConfigurationObserver implementation. This observer will be applied for every consumer that is configured: a perfect fit for our use case. Of course we are not there yet. We still need a way to tell MassTransit to apply this observer. This can be done by calling the ConnectConsumerConfigurationObserver method on the IBusFactoryConfigurator.
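A sketch of how the pieces could fit together, based on the MassTransit v7-era interfaces (the header name and the filter body are illustrative; exact namespaces vary by version):

using System.Threading.Tasks;
using GreenPipes;
using MassTransit;

// The filter: runs for every consumed message and reads the custom header.
public class ReadUserHeaderFilter<TConsumer> : IFilter<ConsumerConsumeContext<TConsumer>>
    where TConsumer : class
{
    public Task Send(ConsumerConsumeContext<TConsumer> context, IPipe<ConsumerConsumeContext<TConsumer>> next)
    {
        var userName = context.Headers.Get<string>("user-name"); // illustrative header name
        // ... use the value (auditing, logging, ...)
        return next.Send(context);
    }

    public void Probe(ProbeContext context) => context.CreateFilterScope("readUserHeader");
}

// The observer: applies the filter to every consumer that gets configured.
public class ReadUserHeaderConfigurationObserver : IConsumerConfigurationObserver
{
    public void ConsumerConfigured<TConsumer>(IConsumerConfigurator<TConsumer> configurator)
        where TConsumer : class, IConsumer
        => configurator.UseFilter(new ReadUserHeaderFilter<TConsumer>());

    public void ConsumerMessageConfigured<TConsumer, TMessage>(IConsumerMessageConfigurator<TConsumer, TMessage> configurator)
        where TConsumer : class, IConsumer
        where TMessage : class
    {
    }
}

// Finally, connect the observer on the bus factory configurator:
// cfg.ConnectConsumerConfigurationObserver(new ReadUserHeaderConfigurationObserver());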

MassTransit–Adding a header using middleware

MassTransit offers a lot of flexibility and it is easy to tweak it to your needs. With all this power it is not always easy to find out the best way to solve a specific problem. I wanted to include a header with information about the user in every message. I solved it by adding two filters: one for sending messages and one for publishing messages. Inside these filters I resolve an IUserFactory which gives me access to the underlying user. I registered both filters in the IoC container (Autofac in our case) to make sure that the dependencies are correctly resolved, and as a last step I configured MassTransit to use these filters.
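A sketch of the send-side filter (the publish-side filter is analogous but implements IFilter<PublishContext>; IUserFactory is the post's own abstraction, while GetCurrentUser and the header name are assumptions):

using System.Threading.Tasks;
using GreenPipes;
using MassTransit;

public class AddUserHeaderSendFilter : IFilter<SendContext>
{
    private readonly IUserFactory _userFactory;

    public AddUserHeaderSendFilter(IUserFactory userFactory) => _userFactory = userFactory;

    public Task Send(SendContext context, IPipe<SendContext> next)
    {
        // Stamp the current user on every outgoing message.
        context.Headers.Set("user-name", _userFactory.GetCurrentUser()?.Name);
        return next.Send(context);
    }

    public void Probe(ProbeContext context) => context.CreateFilterScope("addUserHeader");
}

// Wiring both pipes up on the bus factory configurator:
// cfg.ConfigureSend(send => send.UseFilter(new AddUserHeaderSendFilter(userFactory)));
// cfg.ConfigurePublish(publish => publish.UseFilter(new AddUserHeaderPublishFilter(userFactory)));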

C# 8–IAsyncDisposable

The IDisposable interface has always been an important cornerstone for correctly cleaning up resources when you no longer need them. Together with the using statement it became an easy pattern to implement and use in C#. Unfortunately, you couldn’t use IDisposable when you had some async work to do to clean up your resources. This is solved with the introduction of the IAsyncDisposable interface in C# 8. Be aware that when you implement the IAsyncDisposable interface and your class is not sealed, you should not only implement the interface but also provide a second method, DisposeAsyncCore, with the signature protected virtual ValueTask DisposeAsyncCore(). This helps guarantee that all resources are cleaned up correctly when your class is used as a base class. Your code can then be used in an ‘await using’ statement. More information: Implementing a dispose async method
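Here is what that pattern looks like in practice, a sketch following the documented pattern (Utf8JsonWriter stands in for any async-disposable resource):

using System;
using System.Text.Json;
using System.Threading.Tasks;

public class ReportWriter : IAsyncDisposable
{
    private Utf8JsonWriter _writer; // an async-disposable resource

    public async ValueTask DisposeAsync()
    {
        // Delegate the real cleanup to the virtual core method...
        await DisposeAsyncCore();
        // ...and suppress finalization, as the pattern prescribes.
        GC.SuppressFinalize(this);
    }

    protected virtual async ValueTask DisposeAsyncCore()
    {
        if (_writer != null)
        {
            await _writer.DisposeAsync();
            _writer = null;
        }
    }
}

// Usage:
// await using (var writer = new ReportWriter())
// {
//     // ...
// } // DisposeAsync is awaited here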

ASP.NET Core–Validate claims based on controller-action

For a CRUD-style application we created some base controllers that encapsulate most of the logic. For every action (create/read/update/delete) a different claim applies, and we would like to add an authorization filter to the base action methods. To handle this scenario, we created a custom authorization filter that uses the controller name and action name to check for a specific claim.
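A sketch of such a filter (the claim type "permission" and the controller.action naming convention are assumptions):

using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Mvc.Controllers;
using Microsoft.AspNetCore.Mvc.Filters;

public class ClaimsAuthorizationFilter : IAuthorizationFilter
{
    public void OnAuthorization(AuthorizationFilterContext context)
    {
        if (context.ActionDescriptor is ControllerActionDescriptor descriptor)
        {
            // Build the required claim value from the controller and action
            // names, e.g. "orders.create" for OrdersController.Create.
            var requiredClaim = $"{descriptor.ControllerName}.{descriptor.ActionName}".ToLowerInvariant();

            if (!context.HttpContext.User.HasClaim(c => c.Type == "permission" && c.Value == requiredClaim))
            {
                context.Result = new ForbidResult();
            }
        }
    }
}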

.NET Core Tools

Starting with .NET Core 2.1, Microsoft introduced the Dotnet Tools platform as a simple way to consume existing (small) .NET Core applications through NuGet. In a way it is quite similar to installing tools through NPM. You can see the tools you already installed by executing the following command: dotnet tool list -g This should output the list of global tools:

C:\Users\BaWu>dotnet tool list -g
Package Id         Version          Commands
-------------------------------------------------
dotnet-ef          3.0.0            dotnet-ef
dotnet-format      3.3.111304       dotnet-format
dotnet-try         1.0.19553.4      dotnet-try

The question is: how can we easily find the tools that are out there? Go to nuget.org, click on Packages and hit the Filter button to see a list of filter options. One of the options is to filter on the .NET Tool package type. That should do the trick… More information here: https://docs.microsof
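Once you have found a tool, managing it is a one-liner (using dotnet-ef as an example):

dotnet tool install -g dotnet-ef      # install a global tool
dotnet tool update -g dotnet-ef       # update it to the latest version
dotnet tool uninstall -g dotnet-ef    # remove it again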

GraphQL - Faker

A great way to mock your GraphQL API (or extend an existing API) is through GraphQL Faker. It can act as a proxy between your existing API and your app, making it really easy to try out new use cases and develop your frontend and backend in parallel. Installation To use it, first install it through NPM: npm install -g graphql-faker Usage Now you can either create a completely new GraphQL API: graphql-faker --open Or extend an existing API: graphql-faker ./ext-swapi.graphql --extend http://swapi.apis.guru Now you can open up the editor at http://localhost:9002/editor/ and start using faker directives in your GraphQL schema. Use the @fake directive to specify how to fake data, @listLength to specify the number of returned array items and the @examples directive to provide concrete examples. Save the model, then browse to http://localhost:9002/graphql and start exploring your fake GraphQL schema.
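For example, a schema extension using those directives could look like this (the Person type and the field names are illustrative):

extend type Person {
  nickname: String @fake(type: firstName)
  salary: Float @fake(type: money)
  friendNames: [String!] @listLength(min: 2, max: 5) @fake(type: firstName)
  currency: String @examples(values: ["EUR", "USD", "GBP"])
}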

Domain Driven Design–3 rules to help you get started

I recently started a new project where we are using DDD principles to build our software. As most of my team members are new to DDD, I noticed some obvious mistakes. Here are some rules that can help you avoid making the same mistakes… Rule #1: Your domain should be modelled first When you start a new project, spend enough time on the domain before introducing a frontend and a database. Try to identify your bounded contexts. What are your entities? What could be a good aggregate? Which domain events are needed? This allows you to really focus on the domain itself and explore it in depth. This is also a perfect time to introduce TDD (Test-Driven Development) in your team. Rule #2: Your domain should reflect your business logic, not your database If you start with Rule #1 you are already on the right path. In too many cases I see that the domain is immediately set up in such a way that it works with the ORM tool the team has chosen. If the ORM needs getters and setters, they are adde

MassTransit–Create a scoped filter that shares the scope with a consumer

Here is what I wanted to achieve: I want a MassTransit filter that reads some user information from a message header and stores it in a scoped class. The idea is that the same scope applies inside my consumer, so that I’m guaranteed the correct audit information is written to the database. This use case doesn’t sound too hard, and a first look at the documentation showed that you can have scoped filters: https://masstransit-project.com/advanced/middleware/scoped.html#available-context-types Unfortunately these filters don’t share the same scope with the consumer and turned out not to be a good solution for my use case. As I mentioned yesterday, I discovered the way to go when having a look at the MassTransit pipeline: what I need is an implementation of a ConsumerConsumeContext filter (quite a mouthful). Let’s take a look at the steps involved: First we create a ConsumerConsumeContext filter. Notice that to get it working I had to use the Service
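A sketch of the core idea (MassTransit v7-era APIs; UserInfo, the header name, and the assumption that the container integration exposes the scoped IServiceProvider as a payload are mine, not the post's): because the ConsumerConsumeContext pipe runs after the consumer scope has been created, resolving from that payload yields the same scoped instances the consumer receives.

using System;
using System.Threading.Tasks;
using GreenPipes;
using MassTransit;
using Microsoft.Extensions.DependencyInjection;

public class UserInfoFilter<TConsumer> : IFilter<ConsumerConsumeContext<TConsumer>>
    where TConsumer : class
{
    public Task Send(ConsumerConsumeContext<TConsumer> context, IPipe<ConsumerConsumeContext<TConsumer>> next)
    {
        // The consumer scope already exists here, so this provider is the
        // same scope the consumer itself will be resolved from.
        if (context.TryGetPayload(out IServiceProvider provider))
        {
            var userInfo = provider.GetRequiredService<UserInfo>(); // assumed scoped audit class
            userInfo.UserName = context.Headers.Get<string>("user-name");
        }

        return next.Send(context);
    }

    public void Probe(ProbeContext context) => context.CreateFilterScope("userInfo");
}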

MassTransit–Receive pipeline

The last few days I have been struggling with middleware and filters in MassTransit. I had a hard time understanding what was really going on, until I stumbled upon the following image in the documentation. I wish I had seen this image sooner; it would have saved me a lot of time! As a side note, it is a good example of how the right amount of documentation can really make a difference. I have gotten into the habit of spending more time documenting what I’m building, and this not only helps me structure my thoughts but also helps a lot in the design process; it even made some flaws in my reasoning visible early. So my advice for today: document more. (I hope my team is reading this)

.NET Core Nuget - Showing a readme file after restore

In .NET Core and .NET Standard projects, content or tools distributed as part of a NuGet package are no longer installed. However, there is one (hidden?) feature that still works: showing a readme file. Let’s see how to get this done: Add a readme.txt file to the project that you package through NuGet. Then open the .csproj file and add the entry shown below. This will embed the readme.txt file in the root folder of the package, and when the package is restored, the file gets displayed by Visual Studio.
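The entry looks like this (the Pack and PackagePath metadata place readme.txt in the package root):

<ItemGroup>
  <None Include="readme.txt" Pack="true" PackagePath="\" />
</ItemGroup>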

GraphQL–HotChocolate–How to handle an incomplete schema

HotChocolate puts a lot of effort into validating that you produced a valid schema before your application starts up. This is useful when a feature is ready, but during development it can be annoying, as you may want to run your application before having resolvers ready for every field in your schema. By default, when you try to run with an incomplete schema, you’ll get an error message similar to the following:

An error occurred while starting the application.
SchemaException: The schema builder was unable to identify the query type of the schema. Either specify which type is the query type or set the schema builder to non-strict validation mode.
HotChocolate.SchemaBuilder.Create() in SchemaBuilder.Create.cs, line 0
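As the message hints, you can switch the schema builder to non-strict validation during development. A sketch against the HotChocolate v10-era API (QueryType is a placeholder; re-check the exact setup for your version):

// In Startup.ConfigureServices:
services.AddGraphQL(sp =>
    SchemaBuilder.New()
        .AddServices(sp)
        .AddQueryType<QueryType>()                      // placeholder root type
        .ModifyOptions(o => o.StrictValidation = false) // allow an incomplete schema
        .Create());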

Azure DevOps - Improve build time

During our last retrospective, one of the complaints was that the build time was too long. We were using the hosted build agents on Azure, and restoring the NPM packages on every build took way too long. Fortunately, Azure DevOps now offers a new build task called the Cache task. This task restores the cached data based on the provided inputs. Let’s see how we can use this task to cache our NPM packages: Add the Cache task to your build steps. Now we first need to set a cache key. The cache key is used as the identifier for the cache and can be a combination of string values, file paths and file patterns. In our case we want to use the package-lock.json file as the key. Every time the package-lock.json changes, the cache will be invalidated and the npm packages will be restored again. **/package-lock.json, !**/node_modules/**/package-lock.json, !**/.*/**/package-lock.json The second thing we need to set is the directory to populate the cache from and r
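Put together in YAML, the step could look like this (routing the npm cache through npm_config_cache is one common setup; adjust the paths to your repository):

variables:
  npm_config_cache: $(Pipeline.Workspace)/.npm

steps:
- task: Cache@2
  displayName: Cache npm packages
  inputs:
    key: '**/package-lock.json, !**/node_modules/**/package-lock.json, !**/.*/**/package-lock.json'
    path: $(npm_config_cache)
- script: npm ci
  displayName: Restore npm packages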

ASP.NET Core Swagger error - Conflicting method/path combination

After adding a new method to a controller, my OpenAPI (Swagger) endpoint started to complain with the following error message: Conflicting method/path combination "POST api/Orders" for actions - eShopExample.Web.Controllers.OrdersController.CreateOrder (eShopExample), eShopExample.Web.Controllers.OrdersController.ProcessOrders (eShopExample). Actions require a unique method/path combination for Swagger/OpenAPI 3.0. Use ConflictingActionsResolver as a workaround I find the error message itself not very insightful, but taking a look at my OrdersController made it all clear: I had 2 methods that both used attribute-based routing (through the [HttpPost] attribute) but resolved to the same URI, "api/orders". To fix it, I had to use an overload of the [HttpPost] attribute and specify an alternative URI.
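In code, the conflict and the fix look roughly like this (the method bodies, the Order type and the "process" route suffix are illustrative):

[ApiController]
[Route("api/[controller]")]
public class OrdersController : ControllerBase
{
    [HttpPost] // resolves to POST api/orders
    public IActionResult CreateOrder(Order order) => Ok();

    // Before the fix this action also had a bare [HttpPost], producing the
    // same POST api/orders route. The attribute overload below gives it a
    // route template of its own.
    [HttpPost("process")] // resolves to POST api/orders/process
    public IActionResult ProcessOrders() => Ok();
}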