

Showing posts from 2023

NuGet 6.8–Package vulnerability notifications

Starting from version 6.8, NuGet audits PackageReference packages and warns you if any have known vulnerabilities, similar to what npm does when you run npm install. This works when using dotnet restore, and also when using Visual Studio. Nice! More information: Auditing package dependencies for security vulnerabilities | Microsoft Learn
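If you want to tune this behavior, the audit can be configured through MSBuild properties in your project file. A hedged sketch (the values below are illustrative; check the linked documentation for the exact property set supported by your NuGet version):

```xml
<PropertyGroup>
  <!-- Enable or disable the audit (on by default starting with NuGet 6.8) -->
  <NuGetAudit>true</NuGetAudit>
  <!-- Only warn for vulnerabilities of this severity or higher -->
  <NuGetAuditLevel>moderate</NuGetAuditLevel>
</PropertyGroup>
```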

Git–Discard local changes

Git offers a wide range of possibilities to discard local changes. In this post I'd like to share some of the available options. Before I dive into the details, it is important to make the distinction between untracked and tracked files. From the documentation: Git has something called the "staging area" or "index". This is an intermediate area where commits can be formatted and reviewed before completing the commit. Untracked files live in the git working directory but are not managed by git until you stage them. Tracked files Here are some options to discard changes in tracked files: Discard Changes in a Specific File: If you want to discard changes in a specific file, you can use the following command: git checkout -- filename This will replace the changes in the specified file with the last committed version of that file. Discard Changes in All Modified Files: To discard changes in all modified files in the working directory…
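The tracked/untracked distinction can be seen in a small throwaway repository (paths and file names below are illustrative):

```shell
# Create a throwaway repo to demonstrate (illustrative setup)
git init demo && cd demo
git config user.email demo@example.com && git config user.name Demo
echo "original" > notes.txt
git add notes.txt && git commit -m "initial commit"

# Modify the tracked file, then discard the change
echo "scratch work" > notes.txt
git checkout -- notes.txt        # restores the last committed version

# Untracked files are untouched by checkout; git clean removes them
echo "temp" > scratch.tmp
git clean -f                     # removes untracked files from the working directory
```

Note that git clean is destructive for untracked files; running it first with the -n (dry run) flag shows what would be deleted.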

.NET 8–JSON Source Generator improvements

If you don’t know what the JSON source generator is, I suggest first checking this older post before you continue reading. Still there? OK! Today I want to focus on some improvements in the JSON source generator that were introduced with the release of .NET 8. Although the JSON source generator is a great (performance) improvement, it couldn’t handle some of the more recent language features like required members and init-only properties. If you tried to use these features in combination with source generators, you got the following warning before .NET 8: Starting from .NET 8, full support for required and init members has been added and the code above will just work:
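As a hedged sketch of what now works in .NET 8 (the type and context names below are illustrative, not the post's exact code):

```csharp
using System;
using System.Text.Json;
using System.Text.Json.Serialization;

// A model using required and init-only members, which the JSON source
// generator could not handle before .NET 8
public class Person
{
    public required string Name { get; init; }
    public int Age { get; init; }
}

// The source-generated serialization context
[JsonSerializable(typeof(Person))]
public partial class AppJsonContext : JsonSerializerContext { }

public static class Program
{
    public static void Main()
    {
        var person = JsonSerializer.Deserialize(
            """{"Name":"Alice","Age":30}""", AppJsonContext.Default.Person);
        Console.WriteLine(person!.Name);
    }
}
```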

MassTransit–Quorum queues

Mirrored queues have been a feature in RabbitMQ for quite some time. When using mirrored queues, messages are replicated across multiple nodes, providing high availability in a RabbitMQ cluster. Each mirrored queue has a master and one or more mirrors, and messages are replicated from the master to its mirrors. Mirrored queues operate on synchronous replication, meaning that the master node waits for at least one mirror to acknowledge the receipt of a message before considering it successfully delivered. This impacts performance and can result in throughput issues due to the synchronous nature of replication. Certain failure scenarios can result in mirrored queues confirming messages too early, potentially resulting in data loss. Quorum queues Quorum queues are a more recent addition to RabbitMQ, introduced to address some of the limitations posed by mirrored queues. They use a different replication model based on the Raft consensus algorithm. In this model, each queue is replicated…
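In MassTransit, opting into quorum queues is a small configuration change per receive endpoint. A hedged sketch (host and queue names are illustrative; SetQuorumQueue is the MassTransit RabbitMQ endpoint setting for this, but verify the exact overloads against your MassTransit version):

```csharp
using MassTransit;

// Minimal sketch of configuring a receive endpoint as a quorum queue
var busControl = Bus.Factory.CreateUsingRabbitMq(cfg =>
{
    cfg.Host("rabbitmq://localhost");

    cfg.ReceiveEndpoint("order-events", e =>
    {
        // Declare the queue as a quorum queue with 3 replicas
        e.SetQuorumQueue(3);
    });
});
```

Note that a queue's type cannot be changed after declaration; an existing classic/mirrored queue has to be deleted or the endpoint renamed before it can be redeclared as a quorum queue.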

.NET 8 and C# 12–Overview

If you want to see how .NET and C# evolved over time, check out the updated overview created by nietras : Check out his post for more details and a PDF version of the image above.

Company vs company

In the English vocabulary, the word 'company' has 2 meanings: Company: an organization that sells goods or services in order to make money And Company: the fact of being with a person or people, or the person or people you are with I think it is no coincidence that the same word is used for both meanings. Origin The word "company" traces its origins back to the Latin "com-", meaning "together with", and "panis", meaning "bread". In its earliest usage, "company" referred to a group of people who shared meals together, highlighting the communal aspect of coming together around a common table. This initial meaning laid the foundation for the word's dual evolution, branching into both social and business contexts. Social Company: In its more informal sense, "company" refers to a gathering of individuals for social interaction or mutual enjoyment. The shared origin with…

ADFS - MSIS5007: The caller authorization failed for caller identity

Our ASP.NET Core applications typically use WS-Federation with ADFS as our Identity Provider. After configuring a new application (relying party) in ADFS, the first authentication attempt failed with the following error message: Encountered error during federation passive request. A look at the event viewer gave us more details: Protocol Name: wsfed Relying Party: https://localhost/example/ Exception details: Microsoft.IdentityServer.RequestFailedException: MSIS7012: An error occurred while processing the request. Contact your administrator for details. ---> Microsoft.IdentityServer.Service.IssuancePipeline.CallerAuthorizationException: MSIS5007: The caller authorization failed for caller identity <domain>\<ADUser> for relying party trust https://localhost/example/. at Microsoft.IdentityModel.Threading.AsyncResult.End(IAsyncResult result) at Microsoft.IdentityModel.Threading.TypedAsyncResult`1.End(IAsyncResult result) at Microsoft.Id…
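MSIS5007 typically means the relying party trust's issuance authorization rules do not permit the caller, which is easy to overlook on a freshly configured relying party. As a hedged sketch (the right policy depends on your environment; "permit everyone" is only appropriate when authorization happens elsewhere), a permit-all issuance authorization rule in the ADFS claim rule language looks like:

```
=> issue(Type = "http://schemas.microsoft.com/authorization/claims/permit",
         Value = "true");
```

The same rule can be added through the ADFS management console by choosing the "Permit all users" rule template on the relying party trust's issuance authorization rules.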

Find a subset from a set of values whose sum is closest to a specific value–C#

I got an interesting question from my girlfriend last week: given I have a list of numbers, I want to select a subset of numbers whose sum matches a specific (positive) value as closely as possible. Let me give a simplified example to explain what she was asking for: If our list is [12, 79, 99, 91, 81, 47] and the expected value is 150, it should return [12, 91, 47] as 12+91+47 is 150. If our list is [15, 79, 99, 6, 69, 82, 32] and the expected value is 150, it should return [69, 82] as 69+82 is 151, and there is no subset whose sum is 150. This turns out to be known as the Subset sum problem and is a computationally hard problem to solve. Luckily the list of numbers she needs to work with is quite small (about 50 numbers) and we can easily brute force this. Yesterday I explained how this problem can be solved in Excel, but what is the fun in that?! Let us have a look at how we can do this in C#. With some help from GitHub Copilot I came up with the following solution: Let us…
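As a hedged sketch of the brute-force idea (not the post's exact code): enumerate every subset with a bit mask and keep the one whose sum is closest to the target. This variant is only practical for small lists (roughly n ≤ 25, since it walks 2^n subsets):

```csharp
using System;
using System.Collections.Generic;

public static class ClosestSubsetSum
{
    public static List<int> Find(IReadOnlyList<int> numbers, int target)
    {
        var best = new List<int>();
        var bestDiff = int.MaxValue;

        // Each bit in the mask decides whether numbers[i] is in the subset
        for (var mask = 1; mask < (1 << numbers.Count); mask++)
        {
            var subset = new List<int>();
            var sum = 0;
            for (var i = 0; i < numbers.Count; i++)
            {
                if ((mask & (1 << i)) != 0)
                {
                    subset.Add(numbers[i]);
                    sum += numbers[i];
                }
            }

            // Keep the subset whose sum is closest to the target
            var diff = Math.Abs(target - sum);
            if (diff < bestDiff)
            {
                bestDiff = diff;
                best = subset;
            }
        }

        return best;
    }

    public static void Main()
    {
        var result = Find(new[] { 12, 79, 99, 91, 81, 47 }, 150);
        Console.WriteLine(string.Join("+", result)); // a subset summing to exactly 150
    }
}
```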

Find a subset from a set of values whose sum is closest to a specific value– Excel

I got an interesting question from my girlfriend last week: given I have a list of numbers, I want to select a subset of numbers whose sum matches a specific (positive) value as closely as possible. Let me give a simplified example to explain what she was asking for: If our list is [12, 79, 99, 91, 81, 47] and the expected value is 150, it should return [12, 91, 47] as 12+91+47 is 150. If our list is [15, 79, 99, 6, 69, 82, 32] and the expected value is 150, it should return [69, 82] as 69+82 is 151, and there is no subset whose sum is 150. This turns out to be known as the Subset sum problem and is a computationally hard problem to solve. Luckily the list of numbers she needs to work with is quite small (about 50 numbers) and we can easily brute force this. Today I want to show you how we can tackle this problem in Excel using the Solver add-in. Activate the Solver Add-In: Go to the "File" tab. Click on "Options." In the Excel Options…

PowerBI–Limit the amount of imported data

For an Azure Fabric based data platform we are building, I'm evaluating PowerBI as a possible reporting solution. My knowledge of PowerBI is almost non-existent, so there are a lot of things I learned along the way. Yesterday I talked about the different data fetching strategies. Today I want to focus on how to limit the amount of data. I explained yesterday that PowerBI can handle large amounts of data. This is good news, but during development I want to work with a smaller dataset to limit the file size and gain some extra development speed. Limit the amount of data Let’s find out how we can limit the amount of data in PowerBI. Start by opening the Query Editor view in PowerBI by selecting Transform data from the Home tab: This will load the Query Editor. Here we need to create a new parameter that can be used as a limit value. Click on Manage Parameters and choose New Parameter: We create a LimitRows parameter with a default value of 10: Now that we have…
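Once the parameter exists, it can be applied as an extra step in a query. A hedged sketch in Power Query (M) — the source, schema, and table names below are illustrative, only the Table.FirstN step with the LimitRows parameter is the point:

```
let
    Source = Sql.Database("myserver", "mydb"),
    Orders = Source{[Schema = "dbo", Item = "Orders"]}[Data],
    // Keep only the first LimitRows rows during development
    LimitedRows = Table.FirstN(Orders, LimitRows)
in
    LimitedRows
```

Bumping the LimitRows parameter back up (or removing the step) before publishing restores the full dataset.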

PowerBI– How to fetch your data?

For an Azure Fabric based data platform we are building, I'm evaluating PowerBI as a possible reporting solution. My knowledge of PowerBI is almost non-existent, so there are a lot of things I learned along the way. My first lesson learned: there are multiple ways to fetch data in PowerBI. I learned this lesson quite fast when I tried to connect to my first datasource. After selecting the ‘Import data from database’ option I immediately got to choose a Data Connectivity mode: The available options were: Import DirectQuery Let’s find out what each of these options means and what exactly the difference is. Import mode This mode should be your first choice. As the name implies, in this mode the data is imported into the report. When querying the data everything is loaded into memory, which gives really fast query performance. This sounds like a good idea for really small datasets, but what if you have a lot of data? PowerBI makes it possible to work with big datasets…

C#–Declaring attributes on positional record types

In C# 9, record types were introduced. A record in C# is a class or struct that provides special syntax and behavior for working with data models. What makes them different from a ‘normal’ class or struct is that they offer the following functionality: Value equality: two record instances are equal if the types match and all property and field values match. Immutability: you cannot change any property or field value after instantiation. You have 2 ways to define a record type. One way is similar to what you already know from creating classes or structs: A second way is through positional parameters: Behind the scenes, the compiler does a lot of work for us as it creates: A public auto-implemented property for each positional parameter provided in the record declaration. A primary constructor whose parameters match the positional parameters on the record declaration. A Deconstruct method with an out parameter for each positional parameter provided in the record declaration…
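Because the compiler turns each positional parameter into a generated auto-property, an attribute written on the parameter needs a target to end up in the right place. A hedged sketch (the record and attribute choices are illustrative) of the property: target the post's title refers to:

```csharp
using System.Text.Json.Serialization;

// Without a target, the attribute applies to the constructor parameter.
// The `property:` target places it on the generated auto-property instead,
// which is what serializers such as System.Text.Json look at.
public record Person(
    [property: JsonPropertyName("first_name")] string FirstName,
    [property: JsonPropertyName("last_name")] string LastName);
```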

.NET 8–Http Logging

In .NET 7 and before, the default request logging in ASP.NET Core is quite noisy, with multiple events emitted per request. That is one of the reasons why I use the Serilog.AspNetCore package. By adding the following line to my ASP.NET Core application I can reduce the number of events to 1 per request. The result looks like this in Seq: Starting from .NET 8, the HTTP logging middleware has several new capabilities and we no longer need Serilog to achieve the same result. By configuring the following 2 options, we can achieve this: HttpLoggingFields.Duration: when enabled, this emits a new log at the end of the request/response measuring the total time in milliseconds taken for processing. This has been added to the HttpLoggingFields.All set. HttpLoggingOptions.CombineLogs: when enabled, the middleware will consolidate all of its enabled logs for a request/response into one log at the end. This includes the request, request body, response, response body, and…
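Putting the two options together, a minimal .NET 8 setup could look like the sketch below (endpoint and app structure are illustrative):

```csharp
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.HttpLogging;
using Microsoft.Extensions.DependencyInjection;

var builder = WebApplication.CreateBuilder(args);

// One consolidated log entry per request, including the duration
builder.Services.AddHttpLogging(options =>
{
    options.LoggingFields = HttpLoggingFields.All; // includes Duration in .NET 8
    options.CombineLogs = true;                    // merge into a single log per request
});

var app = builder.Build();
app.UseHttpLogging();
app.MapGet("/", () => "Hello");
app.Run();
```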

Use the index, Luke!

While investigating some database related performance issues, I discovered the following website: This site explains everything you as a developer need to know about SQL indexing. Use The Index, Luke is the free web edition of SQL Performance Explained. It presents indexing in a vendor-agnostic fashion but also shares product-specific guidelines for DB2, MySQL, Oracle, PostgreSQL and SQL Server. Do you think you already know everything you need to know about SQL indexing? Confirm your skills by taking this short 3-minute test: 3-Minute Test: What do you know about SQL performance? I could not say that I did so great:

ADFS–ID4216 error

After creating a new claims rule, our ADFS instance started to return the following error message: The new rule we created just passes through an existing claim value: c:[Type == "urn:be:vlaanderen:acm:rrn"]    => issue(claim = c); To explain why this error happened, I have to give some extra context. Our ADFS instance is federated with another Identity Provider STS (IP-STS) and is acting as a resource STS (R-STS). The communication between the IP-STS and the R-STS (our ADFS instance) is done through the SAML 2.0 protocol and the tokens returned are also in SAML 2.0 format. However, the communication between the R-STS and the relying party (an ASP.NET Core application) is done through WS-Federation and the token format used there is SAML 1.1. SAML tokens have URI (ClaimType) rules that will differ based on the version of the SAML token you intend to issue. AD FS 2.0 supports WS-Federation, WS-Trust, and SAML 2.0 protocols. The WS-Federation protocol only supports SAML 1.1 tokens.
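Because SAML 1.1 claim types must be URIs in namespace/name format (containing a slash), a urn:-style type cannot be passed through unchanged to a WS-Federation relying party. One hedged fix is to transform the incoming claim into an http-style claim type; the replacement type below is illustrative, not the one from our environment:

```
c:[Type == "urn:be:vlaanderen:acm:rrn"]
  => issue(Type = "http://claims.example.com/identity/rrn", Value = c.Value);
```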

ADFS Claim rules

ADFS has the concept of claim rules, which allow you to enumerate, add, delete, and modify claims. This is useful when you want, for example, to introduce extra claims (based on data in a database or AD) or transform incoming claims. Although the available documentation is already helpful, I still find it a challenge to write my own claim rules. Hence this post… Claim rule components To start, a claim rule consists of 2 parts, separated by the "=>" operator: An optional(!) condition An issuance statement So both these rules are correct: Rule #1: => issue(type = "https://test/role", value = "employee"); Rule #2: c:[type == "https://test/employee", value == "true"] => issue(type = "https://test/role", value = "employee") The first rule will always generate a new outgoing claim of type https://test/role. The second rule will only generate a new outgoing claim of type https://test/role…

Azure Pipelines - Nuget - Unable to get local issuer certificate

For an unknown reason some of our builds suddenly started to fail. A look at the build log showed us the following error message: ##[error]Error: unable to get local issuer certificate An error message we had seen before… The error occurred only on one of our build agents, when trying to download a newer NuGet version than the one currently available. We couldn't find a good solution, but as a workaround we manually downloaded the latest NuGet version and copied it to the tools folder at: <agent directory>\_work\_tool\NuGet

The hub-and-spoke network topology for developers

A common network topology used when building cloud infrastructures is the hub-and-spoke model. As I noticed that this model is not always well understood by developers, here is my attempt to describe it from a developer's viewpoint. What Is a Hub-and-Spoke Network Topology? A hub-and-spoke network topology is a type of network architecture where all devices are connected to a central hub. This central hub acts as a focal point for communication, and all the spokes (devices) communicate through this central hub. Think of it as the hub being the center of a wheel, with spokes radiating outward to various endpoints. This topology is also commonly referred to as a star topology. As the name suggests, this topology has 2 key components: Hub: The central hub is a device that serves as the core of the network. It can be a physical device like a switch or a logical point in the network. The hub is responsible for managing and directing the flow of data between all connected…

Add custom properties to ILogger

By default when logging messages through ILogger only objects that have a placeholder in the message template are logged. For example, when executing the following line of code: the following information is logged: This is something that you can also notice when you check the warning you get when hovering over the message:   But what if you want to log these extra properties? One way to get this done is by using scopes to add custom properties. Typically this is used for multiple log entries, but it also works for single log statements:   More information: Logging in C# - .NET | Microsoft Learn
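The scope technique can be sketched as follows (the property names, logger category, and provider choice below are illustrative; depending on the logging provider you may need to enable scope support, e.g. IncludeScopes on console formatter options):

```csharp
using System;
using System.Collections.Generic;
using Microsoft.Extensions.Logging;

using var factory = LoggerFactory.Create(builder =>
    builder.AddJsonConsole(o => o.IncludeScopes = true));
var logger = factory.CreateLogger("Demo");

// Attach extra properties to the log entries written inside the scope;
// a dictionary of key/value pairs becomes structured log properties
using (logger.BeginScope(new Dictionary<string, object>
{
    ["OrderId"] = 1234,
    ["CustomerId"] = "C-42"
}))
{
    // OrderId and CustomerId are logged even though the message
    // template has no placeholder for them
    logger.LogInformation("Order processed in {Elapsed} ms", 87);
}
```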

Property based testing in C#–How to change the number of runs?

After giving my talk about Property Based Testing, I was contacted by one of the participants with the following question: how do you change the number of test runs? In case you have no idea what I'm talking about: by default the property based testing library (FsCheck in my case) will generate a number of inputs and run the tests for all these inputs. By default 100 inputs are generated and tested: You can change the number of test runs either by setting the MaxTest property on the [Property] attribute: Or by passing a configuration object when calling Prop.ForAll
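Both options can be sketched as below (property and class names are illustrative; the API shape matches FsCheck 2.x with FsCheck.Xunit, so verify against the version you use):

```csharp
using FsCheck;
using FsCheck.Xunit;
using Xunit;

public class Properties
{
    // Option 1: via the attribute (FsCheck.Xunit)
    [Property(MaxTest = 500)]
    public bool Addition_is_commutative(int a, int b) => a + b == b + a;

    // Option 2: via a configuration object passed to Check
    [Fact]
    public void With_configuration()
    {
        var config = Configuration.Quick;
        config.MaxNbOfTest = 500;
        Prop.ForAll<int, int>((a, b) => a + b == b + a).Check(config);
    }
}
```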

VISUG Property based testing in C#–Slides and demos

Yesterday I gave a presentation at the Belgian Visual Studio user group (VISUG). In case you couldn't be there or you would like to have my slides and demos, here is all the material: Slides: Demos: wullemsb/propertybasedtestingdemo: Demo's for the property based testing presentation Also check out my blog series about Property Based Testing: Part 1 – Introduction Part 2 – An example Part 3 – Finding edge cases Part 4 – Writing your own generators Part 5 – Locking input

NuGet.CommandLine.CommandLineException: Error parsing solution file

You have to love these days when everything seems to go haywire at the same time. Your code no longer compiles, your CI build status turns red and your unit tests become flaky. Today turned out to be one of those days as our build started to fail with the following error message: System.AggregateException: One or more errors occurred. ---> NuGet.CommandLine.CommandLineException: Error parsing solution file at D:\b\3\_work\154\s\Source\MyApp\MyApp.sln: Exception has been thrown by the target of an invocation. In our pipeline we were using the NuGetToolInstaller task: - task: NuGetToolInstaller@1 inputs: versionSpec: '4.x' We had just installed the latest version of the Visual Studio Build Tools on our build server. This new version has a breaking change with regard to older versions of NuGet. As a fix, we switched to a newer NuGet version: - task: NuGetToolInstaller@1 inputs: versionSpec: '5.x' That's it!

Rearchitecture is a sign of success not failure

An important lesson for a software architect is to design the system for the requirements you have today, not for what a possible future could look like. Generated by Dall-E 2 A lot of systems are built with accidental complexity that could have been avoided. I've seen really complex high-performance code for a system that is used by 10 users, a microservices solution containing 40+ services all maintained by one small team, a real-time event sourcing setup for a system that is mostly CRUD, and I can keep going… There is a reason that Martin Fowler talks about a monolith-first approach. So if you need to build a system that should support 50 customers, don't build one that can support 50,000 customers. Start small and wait until you have so many customers that they start to overload your system. And that is great news, because it means that you are successful! The truth is the majority of applications are never going to reach that stage. And even if you do start to get overloaded…

.NET 6 - Parallel.ForEachAsync

You have maybe used Parallel.ForEach() before. It allows you to iterate over a collection in a parallel way. It works similar to a Parallel.For loop: the loop partitions the source collection and schedules the work on multiple threads based on the available processors in a system. Unfortunately, Parallel.ForEach() cannot be used for asynchronous work. Async vs parallel It is important to understand that "async" and "parallel" are two different concepts. Although they are both related to concurrent programming, they serve different purposes and are used in different contexts. Async is used for non-blocking I/O operations and asynchronous code execution. It is primarily used for tasks that may take some time to complete, like reading from a file, making a network request, or performing database operations. Parallel, however, refers to parallel programming, which is about executing multiple tasks or operations simultaneously to improve performance and take…
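Parallel.ForEachAsync combines both worlds: asynchronous per-item work executed with bounded parallelism. A hedged sketch (the work items and delay simulate real asynchronous I/O):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

var urls = new[] { "a", "b", "c", "d" }; // illustrative work items
var results = new ConcurrentBag<string>();

await Parallel.ForEachAsync(
    urls,
    new ParallelOptions { MaxDegreeOfParallelism = 2 },
    async (url, cancellationToken) =>
    {
        // Simulate asynchronous I/O per item
        await Task.Delay(10, cancellationToken);
        results.Add(url.ToUpperInvariant());
    });

Console.WriteLine(results.Count); // 4
```

Unlike Parallel.ForEach with an async lambda (which fires off unawaited tasks), ForEachAsync awaits the body and only completes when every item has been processed.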

Login failed for user ''

When trying to connect to an Azure SQL database using Azure Data Studio, the connection failed with the following error message: Login failed for user '<token-identified principal>' I’m trying to connect using Azure Active Directory:   Although the error itself was not very clear to me, the error happened because the AAD user I’m using to connect does not exist as a user in the Azure SQL database. To fix this, we need to create the user first. This can be done using the following command: CREATE USER <Azure_AD_principal_name> FROM EXTERNAL PROVIDER; After executing this command, you should be able to connect. Of course, we still need to assign a specific role before we can do anything useful. ALTER ROLE db_datareader ADD MEMBER <Azure_AD_principal_name>; More information Create contained database users in your database mapped to Azure AD identities

Source Generator playground

If you are new to source generators and want to experiment with their possibilities, I can recommend the Source Generator Playground. This Blazor app gives you a simple console application and a source generator and allows you to observe the generated output. You can try it out live here: There are some sample generators available that you can start modifying to see what happens: This is a great way to learn what is possible.

GraphQL Conf 2023

The 2023 edition of GraphQLConf is over. In case you missed this great experience, you can still watch all session recordings and get the speaker slides by going to the event schedule. It is great to see how the GraphQL ecosystem keeps growing and what new innovations and tools are appearing. Start by watching Uri Goldshtein's keynote on the state of GraphQL, focused on highlighting the biggest open source innovations from the last year. Then have a look at the GraphQL Fusion presentation given by Michael Staib, introducing a new open spec for remote execution and federation of services that challenges the existing patterns of specification-less federation protocols with a fresh approach. Learn how GraphQL and CQRS can be combined, how AI is also finding its way into GraphQL land, and much more…

Gall’s law and how it applies to software architecture

When it comes to software architecture, complexity can quickly become a daunting adversary. In the ever-evolving world of technology, designing software systems that are not only functional but also maintainable and scalable is a constant challenge. That's where Gall's Law comes into play. Named after John Gall, an American pediatrician and systems theorist, this principle sheds light on the fundamental idea that underlies software architecture: "A complex system that works is invariably found to have evolved from a simple system that worked." In this post, we'll delve into Gall's Law and explore its profound implications for software architecture. Generated by Dall-E 2 Understanding Gall's Law John Gall's Law, often paraphrased as "complex systems evolve from simple systems that worked", encapsulates a deep insight into how complex systems come into existence. Gall originally formulated this law in the context of systems theory

Simplify Source Generator creation with the RoslynGenerator template

I’m currently writing my own source generator, which can become quite a challenge. Yesterday I talked about RoslynQuoter as one of the tools that helped me along the way. Today I want to show you an easy way to get started thanks to the F0.Templates project. Start by installing the templates using dotnet new --install F0.Templates: Now create a new project using the installed RoslynGenerator template: dotnet new roslyngenerator. This will create not only a SourceGenerator project but also a unit and integration tests project and a benchmark project: Inside the projects we find a simple example of an incremental source generator that searches for methods with the [HelloWorld] attribute and generates an implementation: Remark: If you are looking for a list of existing generators, take a look here: amis92/csharp-source-generators: A list of C# Source Generators (not necessarily awesome) and associated resources: articles, talks, demos.