
Posts

Showing posts from July, 2024

EF Core - The conversion of a datetime2 data type to a datetime data type resulted in an out-of-range value

Although EF Core is a developer-friendly Object-Relational Mapper (ORM), working with it isn't without its challenges. One error that we encountered during a pair programming session was: The conversion of a datetime2 data type to a datetime data type resulted in an out-of-range value. In this blog post, we will delve into the causes of this error and explore ways to resolve it. Understanding the error: this error typically occurs when there is an attempt to convert a datetime2 value in SQL Server to a datetime value, and the value falls outside the valid range for the datetime data type. datetime: this data type in SQL Server has a range from January 1, 1753, to December 31, 9999, with an accuracy of 3.33 milliseconds. datetime2: this newer data type, introduced in SQL Server 2008, has a much broader range from January 1, 0001, to December 31, 9999, with an accuracy of 100 nanoseconds.
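To make the error more concrete, here is a minimal sketch (the Order entity and its properties are made-up examples, not code from an actual project). A non-nullable DateTime that is never set stays at its default of January 1, 0001, which fits in datetime2 but not in datetime, so saving it to a datetime column triggers exactly this error. Making the property nullable, or explicitly mapping the column to datetime2, avoids the problem:

using System;
using Microsoft.EntityFrameworkCore;

public class Order
{
    public int Id { get; set; }

    // An unset DateTime is 0001-01-01; a nullable property avoids persisting that default.
    public DateTime? ShippedOn { get; set; }

    public DateTime CreatedOn { get; set; }
}

public class OrderContext : DbContext
{
    public DbSet<Order> Orders => Set<Order>();

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        // Map the column to datetime2 so the full DateTime range fits.
        modelBuilder.Entity<Order>()
            .Property(o => o.CreatedOn)
            .HasColumnType("datetime2");
    }
}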

Automating MassTransit Consumer Registration

When working with MassTransit, registering consumers can become cumbersome if you have many of them. Luckily, MassTransit provides a way to register all your consumers automatically using AddConsumers. This post will guide you through the process of setting up and using AddConsumers to simplify your consumer registration. What is MassTransit? MassTransit is an open-source distributed application framework for .NET, which simplifies the creation and management of message-based systems. It supports various messaging platforms like RabbitMQ, Azure Service Bus, and Amazon SQS. Setting Up MassTransit: Before diving into consumer registration, let’s quickly set up a MassTransit project. Step 1: Install MassTransit Packages. First, install the necessary MassTransit packages via NuGet. You can use the following command on your favorite command line: dotnet add package MassTransit.RabbitMQ. Step 2: Configure MassTransit. In your Program.cs or wherever you configure your service
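As a rough sketch of what this looks like in practice (assuming a RabbitMQ transport; the OrderSubmitted message and its consumer are hypothetical examples that the assembly scan would pick up):

using MassTransit;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddMassTransit(x =>
{
    // Scan the assembly and register every IConsumer<T> implementation it contains.
    x.AddConsumers(typeof(Program).Assembly);

    x.UsingRabbitMq((context, cfg) =>
    {
        // Create a receive endpoint per registered consumer using the default conventions.
        cfg.ConfigureEndpoints(context);
    });
});

var app = builder.Build();
app.Run();

// Hypothetical message and consumer picked up by the assembly scan.
public record OrderSubmitted(Guid OrderId);

public class OrderSubmittedConsumer : IConsumer<OrderSubmitted>
{
    public Task Consume(ConsumeContext<OrderSubmitted> context)
    {
        Console.WriteLine($"Order {context.Message.OrderId} submitted");
        return Task.CompletedTask;
    }
}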

Don’t talk about non-functional requirements, talk about quality attributes

When discussing software development, terms shape our perception and priorities. One such discussion revolves around the terminology used for requirements that go beyond direct functionalities. Traditionally, when talking about architectural needs, I used to call them non-functional requirements (NFRs). However, I recently switched to a more fitting term, quality attributes, as it may better emphasize their importance and value. Let me explain why this is more than just a semantic change but a strategic enhancement… What are non-functional requirements? Before I explain my reasoning, let me shortly define what non-functional requirements are: Non-functional requirements (NFRs), also known as quality attributes, refer to criteria that judge the operation of a system rather than specific behaviors or functions. These requirements define the overall qualities and characteristics that the system must possess, ensuring it meets user needs and performs efficiently an

Debug your .NET 8 code more efficiently

.NET 8 introduces a lot of debugging improvements. If you take a look, for example, at the HttpContext, you see that you get a much better debug summary than in .NET 7. But that is not the feature I want to bring to your attention. After recently updating my Visual Studio version, I noticed the following announcement among the list of new Visual Studio features: That is great news! This means that you can debug your .NET 8 applications without a big performance impact on the rest of your code. The only thing we need to do is to disable the Just My Code option in Visual Studio: If we now try to debug a referenced release binary, only the relevant parts are decompiled without impacting the other code. More information: Debugging Enhancements in .NET 8 - .NET Blog (microsoft.com)

EF Core - Error CS1503 Argument 2: cannot convert from 'string' to 'System.FormattableString'

I was pair programming with a team member when she got the following compiler error: Error CS1503 Argument 2: cannot convert from 'string' to 'System.FormattableString'. The error appeared after trying to compile the following code: The reason becomes apparent when we look at the signature of the SqlQuery method: As you can see, the method expects a FormattableString, not a string. Why is this? By using a FormattableString, EF Core can protect us from SQL injection attacks. When we use this query with parameters (through string interpolation), the supplied parameter values are wrapped in a DbParameter. To get rid of the compiler error, we can do two things: 1. We explicitly create a FormattableString from a regular string. 2. We use string interpolation and add a ‘$’ sign in front of the query string. More information: SQL Queries - EF Core | Microsoft Learn; RelationalDatabaseFacadeExtensions.SqlQuery<TResult> Method (Microsoft.EntityFrameworkCore)
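A small sketch of both fixes, assuming an existing DbContext instance and a hypothetical ProductDto result type (SqlQuery<TResult> requires EF Core 7 or later):

using System;
using System.Collections.Generic;
using System.Linq;
using System.Runtime.CompilerServices;
using Microsoft.EntityFrameworkCore;

// Hypothetical projection type for the raw SQL results.
public class ProductDto
{
    public int Id { get; set; }
    public string Name { get; set; } = "";
    public decimal UnitPrice { get; set; }
}

public static class ProductQueries
{
    // Fix 1: explicitly create a FormattableString from a regular string
    // (only safe when the text contains no user input).
    public static List<ProductDto> GetAll(DbContext context)
    {
        FormattableString sql =
            FormattableStringFactory.Create("SELECT Id, Name, UnitPrice FROM Products");
        return context.Database.SqlQuery<ProductDto>(sql).ToList();
    }

    // Fix 2: use string interpolation (the '$' prefix); EF Core wraps {minPrice} in a DbParameter.
    public static List<ProductDto> GetMoreExpensiveThan(DbContext context, decimal minPrice)
        => context.Database
            .SqlQuery<ProductDto>(
                $"SELECT Id, Name, UnitPrice FROM Products WHERE UnitPrice > {minPrice}")
            .ToList();
}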

Git–Dubious ownership

Today I got a strange Git error message I had never seen before. When I tried to execute any action on a local Git repo, it failed with the following error: git status fatal: detected dubious ownership in repository at 'C:/projects/examplerepo' 'C:/projects/examplerepo' is owned by: 'S-1-5-32-544' but the current user is: 'S-1-5-21-1645522239-329068152-682003330-18686' To add an exception for this directory, call: git config --global --add safe.directory C:/projects/examplerepo What does this error mean and how can we fix it? Let’s find out! This error means that the current user is not the owner of the git repository folder. Before git executes an operation, it will check this, and if that is not the case it will return the error above. This check exists for security reasons: Git tries to prevent another user from placing files in our git repo folder. You can check the owner of a directory by executing

EF Core - Query splitting

In EF Core, fetching multiple related collections in one request can result in a cartesian product: the same data is loaded multiple times, which negatively impacts performance. An example is when I try to load a list of Customers with their ordered products: The resulting query causes a cartesian product, as the customer data is duplicated for every ordered product in the result set. EF Core will generate a warning in this case, as it detects that the query will load multiple collections. I talked before about the AsSplitQuery method to solve this. It allows Entity Framework to load each related entity with its own query per table: However, you can also enable query splitting globally in EF Core and use it as the default. To do so, update your DbContext configuration: More information: EF Core–AsSplitQuery() (bartwullems.blogspot.com); Single vs. Split Queries - EF Core | Microsoft Learn
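Both options in a minimal sketch, with placeholder entities and connection string: AsSplitQuery opts a single query in, while UseQuerySplittingBehavior makes split queries the default for the whole context:

using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore;

public class Customer
{
    public int Id { get; set; }
    public List<Order> Orders { get; set; } = new();
}

public class Order
{
    public int Id { get; set; }
    public List<Product> Products { get; set; } = new();
}

public class Product
{
    public int Id { get; set; }
}

public class ShopContext : DbContext
{
    public DbSet<Customer> Customers => Set<Customer>();

    protected override void OnConfiguring(DbContextOptionsBuilder options)
        => options.UseSqlServer(
            "Server=.;Database=Shop;Trusted_Connection=True",
            // Make split queries the default for every query in this context.
            sql => sql.UseQuerySplittingBehavior(QuerySplittingBehavior.SplitQuery));

    // Per-query opt-in: AsSplitQuery() loads each Include as its own SQL statement.
    public Task<List<Customer>> GetCustomersWithProductsAsync()
        => Customers
            .Include(c => c.Orders)
            .ThenInclude(o => o.Products)
            .AsSplitQuery()
            .ToListAsync();
}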

Loading aggregates with EF Core

In Domain-Driven Design (DDD), an aggregate is a cluster of domain objects that are treated as a single unit for the purpose of data changes. The aggregate has a root and a boundary: Aggregate Root : This is a single, specific entity that acts as the primary point of interaction. It guarantees the consistency of changes being made within the aggregate by controlling access to its components. The aggregate root enforces all business rules and invariants within the aggregate boundary. Boundary : The boundary defines what is inside the aggregate and what is not. It includes the aggregate root and other entities or value objects that are controlled by the root. Changes to entities or value objects within the boundary must go through the aggregate root to ensure consistency. An example of an aggregate is an Order (which is the Aggregate root) together with OrderItems (entities inside the Aggregate). The primary function of an aggregate is to ensure data consist
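As an illustration, a minimal sketch of such an Order aggregate and of loading it as a whole with EF Core (the names are assumptions, and mapping details such as backing-field access for the Items collection are left to the model configuration):

using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore;

// Order is the aggregate root; OrderItems can only be changed through it.
public class Order
{
    private readonly List<OrderItem> _items = new();

    public int Id { get; private set; }
    public IReadOnlyCollection<OrderItem> Items => _items.AsReadOnly();

    public void AddItem(string productName, int quantity)
    {
        // The root enforces the invariants of the whole aggregate.
        if (quantity <= 0)
            throw new ArgumentOutOfRangeException(nameof(quantity));

        _items.Add(new OrderItem(productName, quantity));
    }
}

public class OrderItem
{
    public int Id { get; private set; }
    public string ProductName { get; private set; }
    public int Quantity { get; private set; }

    internal OrderItem(string productName, int quantity)
        => (ProductName, Quantity) = (productName, quantity);
}

public static class OrderRepository
{
    // Load the aggregate as a whole: the root always comes with its items.
    public static Task<Order?> LoadAsync(DbContext context, int orderId)
        => context.Set<Order>()
            .Include(o => o.Items)
            .FirstOrDefaultAsync(o => o.Id == orderId);
}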

Entity Framework Core– Avoid losing precision

When mapping a decimal type to a database through an ORM like EF Core, it is important to consider the precision. You don't want to lose data or end up with incorrect values because the maximum number of digits differs between the application and database. If you don’t explicitly configure the store type, EF Core will give you a warning to avoid losing precision. Imagine that we have the following Product class with a corresponding configuration: If we try to use this Product class in our application, we get the following warning: warn: SqlServerEventId.DecimalTypeDefaultWarning[30000] (Microsoft.EntityFrameworkCore.Model.Validation)       No store type was specified for the decimal property 'UnitPrice' on entity type 'Product'. This will cause values to be silently truncated if they do not fit in the default precision and scale. Explicitly specify the SQL server column type that can accommodate all the values in 'OnModelCreating' using 'Has
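The warning itself points to the fix: explicitly configure the store type or precision in OnModelCreating. A short sketch, reusing the Product and UnitPrice names from the warning:

using Microsoft.EntityFrameworkCore;

public class Product
{
    public int Id { get; set; }
    public decimal UnitPrice { get; set; }
}

public class CatalogContext : DbContext
{
    public DbSet<Product> Products => Set<Product>();

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        // Option 1: specify precision and scale.
        modelBuilder.Entity<Product>()
            .Property(p => p.UnitPrice)
            .HasPrecision(18, 2);

        // Option 2: map to an explicit SQL Server column type instead.
        // modelBuilder.Entity<Product>()
        //     .Property(p => p.UnitPrice)
        //     .HasColumnType("decimal(18,2)");
    }
}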

CS0012: The type 'System.Object' is defined in an assembly that is not referenced.

After referencing a .NET Standard 2.0 project in a .NET 4.8 ASP.NET MVC project, the project failed at runtime with the following error message: CS0012: The type 'System.Object' is defined in an assembly that is not referenced. You must add a reference to assembly 'netstandard, Version=2.0.0.0, Culture=neutral, PublicKeyToken=cc7b13ffcd2ddd51'. Whoops! Let me explain how I fixed the problem. The solution – Part I: I first tried to add the NETStandard.Library NuGet package to the .NET 4.8 project, but that didn’t make the error disappear (although I come back to this in Part II below). So I removed the NuGet package again and instead did the following: I manually edited the csproj file and added the following reference: I also updated the reference and set Copy Local=true. After doing that, the error disappeared and the application ran successfully on my local machine. Victory… … or not? After committing the updated project, a colleague conta

Visual Studio–View .NET Counters while debugging

The .NET runtime exposes multiple metrics through the concept of Event Counters. They were originally introduced in .NET Core as a cross-platform alternative to Performance Counters, which only worked on a Windows OS. With the recent introduction of OpenTelemetry support in .NET and the System.Diagnostics.Metrics API, there is a clear path forward. But this doesn’t mean that Event Counters are not useful anymore. The tooling and ecosystem around them is evolving to support both Event Counters and System.Diagnostics.Metrics. For example, you can see this in action when using the global dotnet-counters tool. Whereas before I always used the dotnet-counters tool to monitor the .NET counters, I recently discovered during a debugging session in Visual Studio that you can access this information directly in the Diagnostic Tools window. During a debugging session, go to Diagnostic Tools: Select the .NET Counters option from the Select Tool dropdown if the Counters are not yet enabled
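Besides the built-in runtime counters, you can publish your own metrics through the System.Diagnostics.Metrics API mentioned above. A small sketch (the meter and counter names are made up):

using System.Diagnostics.Metrics;

public static class ShopMetrics
{
    // A Meter groups related instruments; its name is what you select in the tooling.
    private static readonly Meter Meter = new("Example.Shop", "1.0.0");

    private static readonly Counter<long> OrdersPlaced =
        Meter.CreateCounter<long>("orders-placed", description: "Number of orders placed");

    public static void OrderPlaced() => OrdersPlaced.Add(1);
}

You can then watch these with, for example, dotnet-counters monitor --name <your process> --counters Example.Shop; whether they also show up in the Visual Studio .NET Counters tool depends on your Visual Studio version.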

.NET Aspire Developers Day is coming up!

If you are part of the .NET community, you have certainly heard about .NET Aspire. It is a new framework/tool/set of patterns from Microsoft that allows you to build observable, cloud-ready distributed applications. Its main goal is to make it easier to build cloud-native applications on top of .NET. Now, if you haven’t had the time yet to take a look at .NET Aspire and want to learn what all the fuss is about, I have some good news for you. On July 23, Microsoft will host the .NET Aspire Developers Day. This livestream event will be a full day of sessions with one common goal: to show you how easy it is to harness the power of .NET Aspire, why it’s essential for modern development, and how you can leverage a vibrant community for support and innovation. If you’d like to join, subscribe here for the event. Hope to see you all (virtually) there! In preparation for the event, you can watch the recording of the Let’s Learn .NET Aspire beginner series: More information: .NET A

Visual Studio - FastUpToDate warning

While working on updating a (very old) existing .NET application, I noticed the following message in the build output: FastUpToDate: This project has enabled build acceleration, but not all referenced projects produce a reference assembly. Ensure projects producing the following outputs have the 'ProduceReferenceAssembly' MSBuild property set to 'true': 'C:\projects\Example.Data\bin\Debug\netstandard2.0\Example.Data.dll'. See https://aka.ms/vs-build-acceleration for more information. (Example.Business) Build acceleration; that was a topic I talked about before. It is a feature of Visual Studio that reduces the time required to build projects (as you could already have guessed). Because the mentioned projects were targeting .NET Standard 2.0, some extra work is required to make build acceleration work. Before .NET 5 (including .NET Framework and .NET Standard), you should set ProduceReferenceAssembly to true in order to speed up incremental builds. S

SQL Server–Does a ‘LIKE’ query benefit from having an index?

Last week I was interviewing a possible new colleague for our team. During the conversation we were talking about database optimization techniques, and of course indexing was one of the items on the list. While discussing this topic, the candidate made the following statement: An index doesn’t help when using a LIKE statement in your query. This was not in line with my idea, but he was so convinced that I decided to double-check. Hence this blog post… Does a ‘LIKE’ query benefit from having an index? Short answer: YES! The somewhat longer answer: yes, a LIKE statement can benefit from an index in SQL Server, but its effectiveness depends on how the LIKE pattern is constructed. Let’s explain this with a small example. We created a Products table and an index on the ProductName column. Let’s now try multiple LIKE statement variations: Suffix Wildcard (Efficient Index Usage) This query will benefit from the index because the wildcard is at the end: Prefix Wil
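The same point can be shown from the EF Core side, in line with the other posts here (a rough sketch; the Product entity and the index on ProductName are assumptions): StartsWith translates to a LIKE with only a trailing wildcard, so SQL Server can seek on the index, while Contains produces a leading wildcard and typically forces a scan:

using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore;

public class Product
{
    public int Id { get; set; }
    public string ProductName { get; set; } = "";
}

public static class ProductSearches
{
    // Translates to roughly: WHERE ProductName LIKE N'chai%' (trailing wildcard, index seek possible).
    public static Task<List<Product>> StartingWithAsync(DbContext context, string term)
        => context.Set<Product>()
            .Where(p => p.ProductName.StartsWith(term))
            .ToListAsync();

    // Translates to roughly: WHERE ProductName LIKE N'%chai%' (leading wildcard, expect a scan).
    public static Task<List<Product>> ContainingAsync(DbContext context, string term)
        => context.Set<Product>()
            .Where(p => p.ProductName.Contains(term))
            .ToListAsync();
}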

Understanding Pure Domain Modelling: Bridging the Gap Between Existing Systems and the Real Domain

Domain modelling plays a crucial role in the way that I design systems to reflect the business's needs and processes. However, in my experience there is often a disconnect between the idealistic view of domain modelling and the practical reality faced by domain experts. One key issue is that domain experts tend to start from their existing systems rather than describing the 'real' domain. In this post I want to talk about pure domain modelling as a way to overcome the bias that domain experts have when explaining their needs. The domain expert bias: When domain experts contribute to domain modelling, they frequently start from the perspective of the existing systems they are familiar with. These systems, whether they are legacy products, databases, or other technological solutions, shape their understanding and descriptions of the domain. While this approach has its advantages, it also introduces several challenges: Bias Towards Existing Systems: Domain experts may de