
Posts

Showing posts from 2024

Visual Studio–Share your settings

In VSCode you can share your settings through profiles. This allows you to easily apply your UI layout, settings and extensions to multiple VSCode instances. A similar thing is possible in Visual Studio. Settings can be exported through the Import and Export Settings Wizard: Go to Tools –> Import and Export Settings. Choose Export selected environment settings and click on Next >. Now you can choose which settings should be exported. Check or uncheck the settings you want to export and click on Next >. Specify where you want to store your .vssettings file and click on Finish. You can now close the wizard. Remark: When you sign in to Visual Studio on multiple computers using the same personalization account, your settings can be synchronized across the computers. Although the .vssettings file allows you to share a lot of configuration settings, it cannot be used to share the installed features and extensions. However this is possible through an

Property based testing in C#–CsCheck

Almost a year ago I wrote a series of blog posts on how to use property-based tests in C#. Part 1 – Introduction Part 2 – An example Part 3 – Finding edge cases Part 4 – Writing your own generators Part 5 – Locking input In this series I used FsCheck ( https://fscheck.github.io/FsCheck/ ) as the library of my choice. Although originally created for F#, it also works for C#, as I have demonstrated. However, because of its F# origins, it sometimes feels strange to use FsCheck from C#. If you prefer a more idiomatic alternative, have a look at CsCheck, also inspired by QuickCheck but specifically created for C#. CsCheck offers no specific integration but can be used with any testing framework (xUnit, NUnit, MSTest, …). Here is a small example: CsCheck does really well in the shrinking challenge and offers support for multiple types of tests, including concurrency testing. This is a feature I really like as concurrency related issues
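A minimal CsCheck property test might look like the sketch below (an illustration, assuming the xUnit test framework; the test name is made up and the exact `Gen`/`Sample` overloads may differ slightly between CsCheck versions):

```csharp
using CsCheck;
using Xunit;

public class MathProperties
{
    [Fact]
    public void Addition_Is_Commutative()
    {
        // Generate random pairs of ints; Sample fails the test
        // (and shrinks the counterexample) when the predicate returns false.
        Gen.Select(Gen.Int, Gen.Int)
           .Sample((x, y) => x + y == y + x);
    }
}
```

Note how the generator and the property live directly inside a regular test method, which is what makes CsCheck framework-agnostic.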

Share a private key without using passwords

If you follow security best practices, you are not re-using the same password for multiple purposes. As a consequence you end up with a long list of passwords that you need to secure and manage. Although the use of a password vault certainly improves the experience, I still try to avoid the usage of passwords as much as possible. Today I want to share a ‘trick’ I discovered that allows you to share (export/import) a PFX file without using passwords. No clue what a PFX is? Let me explain that first… A PFX file, also known as a PKCS#12 file, is a binary format used to store certificates and their associated private keys. It combines two parts: A certificate part: A certificate is a digital document that contains information about an entity (such as a person, organization, or server). It is used for authentication, encryption, and secure communication. Certificates are issued by a Certificate Authority (CA). A private key part: The private key is a cryptographic key
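As an aside (this is not the trick from the post itself), the two parts of a PFX can be illustrated with OpenSSL: the sketch below generates a throwaway key and self-signed certificate and bundles them into a PKCS#12 file.

```shell
# Generate a disposable private key and self-signed certificate (demo only)
openssl req -x509 -newkey rsa:2048 -keyout key.pem -out cert.pem \
    -days 1 -nodes -subj "/CN=demo"

# Bundle certificate + private key into a single PFX (PKCS#12) file
openssl pkcs12 -export -out bundle.pfx -inkey key.pem -in cert.pem -passout pass:
```

The resulting `bundle.pfx` contains both the certificate and the private key, which is exactly why it normally needs to be protected.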

The importance of the ubiquitous language

We all know that the hardest thing in software development is naming things . Domain Driven Design tries to tackle this by focusing on the 'ubiquitous language'. The "ubiquitous language" refers to a shared language that is used by all team members, including domain experts, developers, and stakeholders, to discuss the domain and the software being developed. This language is designed to bridge the communication gap between technical and non-technical team members, ensuring that everyone has a clear understanding of the domain concepts and requirements. The ubiquitous language consists of domain-specific terms and concepts that are defined collaboratively and consistently used across all artifacts of the software development process, including code, documentation, and discussions. By using a common language, DDD aims to reduce misunderstandings and ambiguities, leading to more effective collaboration and better software designs. The best way to emphasize the imp

Azure Static Web App–Authorization

As a follow-up on the presentation I did at CloudBrew about Azure Static Web Apps I want to write a series of blog posts. Part I – Using the VS Code Extension Part II – Using the Astro Static Site Generator Part III – Deploying to multiple environments Part IV – Password protect your environments Part V – Traffic splitting Part VI – Authentication using pre-configured providers Part VII – Application configuration using staticwebapp.config.json Part VIII – API Configuration Part IX – Injecting snippets Part X – Custom authentication Part XI (this post) – Authorization If we talk about authentication, we also have to talk about authorization. Authorization is role based and every user gets at least one role for free: the “anonymous” role. Once you are authenticated, a second role is assigned: the “authenticated” role. You can see this by browsing to the /.auth/me endpoint after authenticating. You’ll get a response back similar to this
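The `/.auth/me` response has roughly the following shape (a sketch; the identity provider and user values are illustrative):

```json
{
  "clientPrincipal": {
    "identityProvider": "github",
    "userId": "d75b260a64504067bfc5b2905e3b8182",
    "userDetails": "username",
    "userRoles": ["anonymous", "authenticated"]
  }
}
```

Note the `userRoles` array containing both built-in roles mentioned above.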

Implement the mediator pattern using MassTransit

The mediator pattern is a behavioral design pattern used in software engineering. It defines an object that encapsulates how a set of objects interact. In object-oriented programming, programs often consist of many classes. As more classes are added to a program, especially during maintenance or refactoring, communication between these classes becomes complex. Direct communication between objects can lead to tight coupling, making the program harder to read, maintain, and change. The mediator pattern introduces a mediator object that acts as an intermediary between interacting objects. Instead of communicating directly, objects now communicate through the mediator. This reduces dependencies between objects and promotes loose coupling. A popular way to implement the mediator pattern in .NET is through the MediatR library. But if you are already using MassTransit, there is no need to introduce an extra dependency as MassTransit has built-in support for
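A minimal MassTransit mediator setup could look like this (a sketch assuming .NET implicit usings; the `SubmitOrder` message and consumer are made up for illustration):

```csharp
using MassTransit;
using MassTransit.Mediator;
using Microsoft.Extensions.DependencyInjection;

// Register the mediator; no broker is involved, everything stays in-process
var services = new ServiceCollection();
services.AddMediator(cfg => cfg.AddConsumer<SubmitOrderConsumer>());

await using var provider = services.BuildServiceProvider();
var mediator = provider.GetRequiredService<IMediator>();

// The sender only knows the message, not the consumer handling it
await mediator.Send(new SubmitOrder(Guid.NewGuid()));

public record SubmitOrder(Guid OrderId);

public class SubmitOrderConsumer : IConsumer<SubmitOrder>
{
    public Task Consume(ConsumeContext<SubmitOrder> context)
    {
        Console.WriteLine($"Order {context.Message.OrderId} submitted");
        return Task.CompletedTask;
    }
}
```

Because the same `IConsumer<T>` abstraction is used, the consumer can later be moved behind a real transport without changing its code.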

Don’t be a feature team

I saw the following quote pass by while watching a presentation by Nick Tune: The best single source for innovation is your engineers (because they’re working with the enabling technology every day, so they’re in the best position to see what’s just now possible). This quote comes from Marty Cagan, who makes the distinction between product teams and feature teams. What is the difference between the two? Empowerment! Marty Cagan explains empowerment like this in one of his articles: Empowerment of an engineer means that you provide the engineers with the problem to solve, and the strategic context, and they are able to leverage technology to figure out the best solution to the problem. An easy way to tell whether you have empowered engineers or not, is if the first time your engineers see a product idea is at sprint planning, you are clearly a feature team, and your engineers are not empowered in any meaningful sense. The reality is that most teams’ freedom is limi

Azure Static Web App–Custom authentication

As a follow-up on the presentation I did at CloudBrew about Azure Static Web Apps I want to write a series of blog posts. Part I – Using the VS Code Extension Part II – Using the Astro Static Site Generator Part III – Deploying to multiple environments Part IV – Password protect your environments Part V – Traffic splitting Part VI – Authentication using pre-configured providers Part VII – Application configuration using staticwebapp.config.json Part VIII – API Configuration Part IX – Injecting snippets Part X (this post) – Custom authentication After getting sidetracked by talking about configuration, it is now time to focus back on the authentication and authorization part. I already explained that when using Static Web Apps you get two authentication providers out-of-the-box: GitHub Azure Active Directory (Microsoft Entra ID) Today I want to show you how to use the OIDC provider of your choice through custom authentication. So if you want to specify your own Azure AD
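Custom OIDC providers are configured in the auth section of staticwebapp.config.json; a sketch following the documented schema (the provider name, secret setting names, and well-known URL are illustrative):

```json
{
  "auth": {
    "identityProviders": {
      "customOpenIdConnectProviders": {
        "myProvider": {
          "registration": {
            "clientIdSettingName": "MY_PROVIDER_CLIENT_ID",
            "clientCredential": {
              "clientSecretSettingName": "MY_PROVIDER_CLIENT_SECRET"
            },
            "openIdConnectConfiguration": {
              "wellKnownOpenIdConfiguration": "https://example.com/.well-known/openid-configuration"
            }
          },
          "login": {
            "scopes": ["openid", "profile"]
          }
        }
      }
    }
  }
}
```

The client id and secret themselves are not stored in the file; the two setting names point to application settings of the Static Web App.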

GraphQL OWASP Cheat Sheet

I’m a big fan of GraphQL but, as with every technology, it comes with its own set of (security) challenges. To properly secure your GraphQL API, I recommend checking the GraphQL Cheat Sheet. It covers common attack vectors and best practices to avoid them. If you have never heard about the OWASP Cheat Sheet Series, also have a look at all the other sheets in the series. A must-read for every developer!

.NET 8–It’s time to get rid of these flaky tests!

There is only one thing worse than having no tests and that is having flaky tests. You know, tests that sometimes fail and sometimes succeed while the code itself didn't change. One of the biggest reasons this can happen is when your tests have time-related logic. My tests succeed on Monday but not during the weekend. Sidenote: there is a cool Azure DevOps feature specifically to handle and fix flaky tests. Anyway, that is not the topic of this post. In .NET 8 Microsoft tried to fix this by introducing a new time abstraction API. Instead of directly calling DateTime.Now or DateTime.UtcNow inside your code, you can now use the TimeProvider class and ITimer interface. Together they offer time abstraction functionality, facilitating the simulation of time in testing scenarios. Remark: Microsoft was so nice to backport these APIs and made them available in a separate NuGet package Microsoft.Bcl.TimeProvider. So now we can update our code to use the TimeProvider class i
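A sketch of how TimeProvider makes time injectable (the `TokenValidator` type is illustrative; the `FakeTimeProvider` shown in the comments comes from the Microsoft.Extensions.TimeProvider.Testing package):

```csharp
using System;

public class TokenValidator
{
    private readonly TimeProvider _timeProvider;

    public TokenValidator(TimeProvider timeProvider) => _timeProvider = timeProvider;

    // Uses the injected clock instead of calling DateTime.UtcNow directly
    public bool IsExpired(DateTimeOffset expiresAt) =>
        _timeProvider.GetUtcNow() >= expiresAt;
}

// In production code: new TokenValidator(TimeProvider.System)
// In a test, time can be fully controlled, e.g.:
//   var fake = new FakeTimeProvider(new DateTimeOffset(2024, 1, 1, 0, 0, 0, TimeSpan.Zero));
//   var validator = new TokenValidator(fake);
//   fake.Advance(TimeSpan.FromDays(365)); // the token is now expired, on any weekday
```

The test no longer depends on the wall clock, which is exactly what removes the flakiness.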

RabbitMQ–Using Alternate Exchanges to avoid losing messages

A few days ago I blogged about a situation we had where some messages sent to RabbitMQ got lost. I showed a possible solution when using MassTransit. We further investigated the issue and a colleague (thanks Stijn!) suggested another solution using a specific RabbitMQ feature: Alternate Exchanges. The documentation explains the feature like this: It is sometimes desirable to let clients handle messages that an exchange was unable to route (i.e. either because there were no bound queues or no matching bindings). Typical examples of this are detecting when clients accidentally or maliciously publish messages that cannot be routed "or else" routing semantics where some messages are handled specially and the rest by a generic handler Alternate Exchange ("AE") is a feature that addresses these use cases. Whenever an exchange with a configured AE cannot route a message to any queue, it publishes the message to the specified AE instead. If that AE doe
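Configuring an AE boils down to passing the `alternate-exchange` argument when declaring the main exchange; a sketch using the RabbitMQ.Client library (exchange and queue names are illustrative, and the synchronous API shown is the one from the 6.x client):

```csharp
using System.Collections.Generic;
using RabbitMQ.Client;

var factory = new ConnectionFactory { HostName = "localhost" };
using var connection = factory.CreateConnection();
using var channel = connection.CreateModel();

// The alternate exchange that will receive unroutable messages;
// a fanout exchange is typical so nothing gets dropped
channel.ExchangeDeclare("unrouted", ExchangeType.Fanout, durable: true);
channel.QueueDeclare("unrouted-messages", durable: true, exclusive: false, autoDelete: false);
channel.QueueBind("unrouted-messages", "unrouted", routingKey: "");

// The main exchange, configured with the AE via its arguments
channel.ExchangeDeclare("orders", ExchangeType.Direct, durable: true, autoDelete: false,
    arguments: new Dictionary<string, object> { { "alternate-exchange", "unrouted" } });
```

Any message published to `orders` that matches no binding now ends up in `unrouted-messages` instead of being silently discarded.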

Azure DevOps–Set default access level

Something that I always have to explain when someone uses Azure DevOps for the first time is that there are two levels that need to be configured to give someone access to Azure DevOps. The first level is the security level. By adding a user to a specific security group at the project level, the user is able to access specific features and execute specific actions. However there is also a second level, the access level. This level is related to the acquired license and also defines which features are available to you. So only if the two levels align can you use a specific feature inside Azure DevOps. A typical situation: A project administrator adds a new user to a team inside Azure DevOps. By adding a user to a team, the user is added to the Contributors group and should be able to access most features at the project level. However because the user is new in Azure DevOps, they get a default access level assigned. If you didn’t change this default, this will be the Stakeholder access le

Azure DevOps Server–Change attachment size

If you are using Azure DevOps in the cloud, attachments added to a work item are limited to 60MB in size, a value you cannot change. On premise, when using Azure DevOps Server, the default attachment size is limited to 4MB. However you can increase this value up to 2GB. Let me show you how to get this done… Log on to the application tier of your Azure DevOps Server. Remark: You have to log in with an account that is a member of the Team Foundation Administrators group. Browse to the following URL to open the ConfigurationSettingsService: http://localhost:8080/tfs/DefaultCollection/WorkItemTracking/v1.0/ConfigurationSettingsService.asmx Choose the SetMaxAttachmentSize operation from the list of available operations. Specify a maxSize value in bytes and click on Invoke to update the configuration setting. You can validate if the value was changed correctly by going to the GetMaxAttachmentSize operation and invoking it. This will return

F# - The essentials

Long time readers of my blog know that my programming language of choice is not C# but F#. Although I don't have a lot of opportunities to use it in my day-to-day job, I always like to return to functional programming land whenever I can for small (side) projects. If you have never tried F# yourself, this is a good moment to give it a try. The people from Amplifying F# released a FREE online course based on Ian Russell’s Essential F# book. Every week a new aspect of the language is introduced and each lesson takes only 10-15 minutes. So no excuses, just give it a try. The lessons have just started, so this is the right time to join the course. It will make you a better programmer in general and will positively impact the way you write your C# code as well.

.NET 8–Refresh memory limit

With more and more workloads running in the cloud, optimizing resource consumption becomes an important feature of every application. Being able to dynamically scale down the memory limit can help us reduce costs when demand decreases. However before .NET 8, when a service tried to decrease the memory limit on the fly, it could fail as the .NET garbage collector was unaware of this change and would still allocate more memory. No longer in .NET 8! A new API was introduced that allows you to adjust the memory limit on the fly: This will instruct the garbage collector to reconfigure itself by detecting the various memory limits on the system. Calling this API can result in an InvalidOperationException when the newly set limit is lower than what's already committed. Remark: For smaller workloads it can also be useful to switch from Server GC to Workstation GC, which optimizes for lower memory usage. The switch can be done by adding this flag to your csproj file: More inf
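The API in question is GC.RefreshMemoryLimit; a minimal sketch of calling it after a scale-down, handling the documented failure mode:

```csharp
using System;

try
{
    // Ask the GC to re-read the memory limits (e.g. container/cgroup limits)
    GC.RefreshMemoryLimit();
}
catch (InvalidOperationException)
{
    // Thrown when the new limit is lower than what the GC has already committed
}
```

And the Workstation GC switch mentioned at the end is a csproj property:

```xml
<PropertyGroup>
  <ServerGarbageCollection>false</ServerGarbageCollection>
</PropertyGroup>
```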

Azure Static Web App–Inject snippets

As a follow-up on the presentation I did at CloudBrew about Azure Static Web Apps I want to write a series of blog posts. Part I – Using the VS Code Extension Part II – Using the Astro Static Site Generator Part III – Deploying to multiple environments Part IV – Password protect your environments Part V – Traffic splitting Part VI – Authentication using pre-configured providers Part VII – Application configuration using staticwebapp.config.json Part VIII – API Configuration Part IX (this post) – Injecting snippets Before I continue with authentication and authorization, I want to spend one last post on configuration, more specifically on snippets. A snippet is custom code that can be injected into the head or body elements at runtime. This is useful to inject, for example, analytics scripts (Google Analytics, Application Insights, …) or global UI elements. Adding a snippet Open your Static Web App in the Azure Portal and go to Configuration: Select the Snippets tab an

Azure Static Web App–API configuration

As a follow-up on the presentation I did at CloudBrew about Azure Static Web Apps I want to write a series of blog posts. Part I – Using the VS Code Extension Part II – Using the Astro Static Site Generator Part III – Deploying to multiple environments Part IV – Password protect your environments Part V – Traffic splitting Part VI – Authentication using pre-configured providers Part VII – Application configuration using staticwebapp.config.json Part VIII (this post) – API Configuration In the last post in this series, I talked about the application configuration. But an Azure Static Web App can be a combination of an application part and an API part. Today we’ll have a look at how to configure the API part. Storing values API configuration settings can be set through the Azure Portal or Azure CLI and are stored in the backend of your Azure Static Web App in an encrypted format. Azure Portal To set a configuration value through the Azure Portal, go to Configuration: Se
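Setting an API application setting through the Azure CLI looks like this (a sketch; the app name and key/value pair are placeholders):

```shell
az staticwebapp appsettings set \
    --name my-static-web-app \
    --setting-names "DATABASE_CONNECTION_STRING=<value>"

# List the current settings to verify
az staticwebapp appsettings list --name my-static-web-app
```

The values are available to the API functions as environment variables, just like regular Azure Functions application settings.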

GetHashCode() in .NET Core

If you ever had to implement the Equals() method to compare two instances of a type in .NET, you had to implement the GetHashCode() method too. The GetHashCode method returns a numeric value which is used to identify an object during equality testing. It can also serve as an index for an object in a collection. The purpose of the method is to create a key for a hash table. It is by design useful for only one thing: putting an object in a hash table. Comparing hash codes is a fast way to rule out equality: if two objects are equal their hash codes must be equal, so different hash codes mean the objects cannot be equal. In other words, GetHashCode is used to generate a hash code for an object that can be used to compare it with other objects. It is used internally by the .NET framework for quick comparisons. If you had to implement the GetHashCode method, there were some rules that should be followed which could make it quite a challenge to implement it correctly:
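.NET Core's HashCode.Combine takes most of that pain away; a sketch of a typical Equals/GetHashCode pair (the `Point` type is illustrative):

```csharp
using System;

public readonly struct Point : IEquatable<Point>
{
    public int X { get; }
    public int Y { get; }

    public Point(int x, int y) => (X, Y) = (x, y);

    public bool Equals(Point other) => X == other.X && Y == other.Y;

    public override bool Equals(object? obj) => obj is Point p && Equals(p);

    // HashCode.Combine mixes the fields with a well-distributed hash,
    // so the old hand-rolled prime-multiplication recipes are no longer needed.
    public override int GetHashCode() => HashCode.Combine(X, Y);
}
```

The only remaining rule to keep in mind: combine exactly the fields that participate in Equals, so equal instances always produce equal hash codes.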

MassTransit–Avoid losing messages

At one of my clients we had a situation where messages got lost after sending them to RabbitMQ. This is quite bad as the whole point of having a message based solution was to improve the reliability of our solutions (even when one of the involved systems is offline or unavailable). In this post I want to explain what went wrong and how we introduced a solution to prevent this from happening in the future. To understand the problem I first have to explain the concept of an exchange. In RabbitMQ, exchanges are message routing agents that are responsible for routing messages to different queues with the help of header attributes, bindings, and routing keys. A producer never sends a message directly to a queue. Instead, it uses an exchange as a routing mediator. Therefore, the exchange decides if the message goes to one queue, to multiple queues, or is simply discarded. Let me emphasize one sentence here: In RabbitMQ, a producer never sends a message directly to a queue Only que

Azure Static Web App– Application configuration using staticwebapp.config.json

As a follow-up on the presentation I did at CloudBrew about Azure Static Web Apps I want to write a series of blog posts. Part I – Using the VS Code Extension Part II – Using the Astro Static Site Generator Part III – Deploying to multiple environments Part IV – Password protect your environments Part V – Traffic splitting Part VI – Authentication using pre-configured providers Part VII (this post) – Application configuration using staticwebapp.config.json Before I continue with the authentication and authorization part, I want to take a side step into configuration. When talking about configuring a static web app, we have to distinguish between: Application configuration: This allows us to configure the application behavior and features and is managed through the staticwebapp.config.json file. Use this file to define route and security rules, custom headers, and networking settings. Build configuration: Tweak the build p

MassTransit–.NET 8 upgrade warnings

After upgrading to MassTransit 8.1, I got a list of warnings. In this post I'll walk you through the list of warnings and how I fixed them. Let's get started... ConsumerDefinition warning The first warning I got was the following: 'ConsumerDefinition<T>.ConfigureConsumer(IReceiveEndpointConfigurator, IConsumerConfigurator<T>)' is obsolete: 'Use the IRegistrationContext overload instead. Visit https://masstransit.io/obsolete for details.' That is an easy one to fix: I had to rewrite my original ConsumerDefinition to a version that uses the IRegistrationContext as one of the parameters. InMemory Outbox warning A second warning was related to the usage of the InMemoryOutbox: 'InMemoryOutboxConfigurationExtensions.UseInMemoryOutbox(IConsumePipeConfigurator, Action<IOutboxConfigurator>)' is obsolete: 'Use the IRegistrationContext overload instead. Visit https://masstransit.io/obsolete for details.' Here is the
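The rewritten definition takes the IRegistrationContext as an extra parameter; a sketch addressing both warnings at once (the consumer and message types are illustrative):

```csharp
using System.Threading.Tasks;
using MassTransit;

public record OrderSubmitted(Guid OrderId);

public class OrderConsumer : IConsumer<OrderSubmitted>
{
    public Task Consume(ConsumeContext<OrderSubmitted> context) => Task.CompletedTask;
}

public class OrderConsumerDefinition : ConsumerDefinition<OrderConsumer>
{
    // The obsolete two-parameter overload is replaced by this
    // three-parameter version that receives the IRegistrationContext
    protected override void ConfigureConsumer(
        IReceiveEndpointConfigurator endpointConfigurator,
        IConsumerConfigurator<OrderConsumer> consumerConfigurator,
        IRegistrationContext context)
    {
        // The outbox extension likewise gets the context passed in
        endpointConfigurator.UseInMemoryOutbox(context);
    }
}
```

Passing the context through is what the new overloads need to resolve scoped dependencies correctly.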

MassTransit–.NET 8 upgrade errors - No service for type 'MassTransit.Saga.ISagaRepositoryContextFactory`1[MassTransit.JobTypeSaga]' has been registered.

While upgrading an application to .NET 8 (and upgrading to the latest MassTransit version along the way), I stumbled over some MassTransit-specific issues after the upgrade. The first error I got after upgrading my application to .NET 8 was the following: No service for type 'MassTransit.Saga.ISagaRepositoryContextFactory`1[MassTransit.JobTypeSaga]' has been registered. at Microsoft.Extensions.DependencyInjection.ServiceProviderServiceExtensions.GetRequiredService(IServiceProvider provider, Type serviceType)    at Microsoft.Extensions.DependencyInjection.ServiceProviderServiceExtensions.GetRequiredService[T](IServiceProvider provider)    at MassTransit.DependencyInjection.DependencyInjectionSagaRepositoryContextFactory`1.Send[T](ConsumeContext`1 context, Func`3 send) in /_/src/MassTransit/DependencyInjection/DependencyInjection/DependencyInjectionSagaRepositoryContextFactory.cs:line 85    at MassTransit.DependencyInjection.DependencyInjectionSagaRepositoryContextFac

Azure Pipelines error - User lacks permission to complete this action. You need to have 'ReadPackages'.

I blogged before about the following Azure Pipelines error when trying to do a NuGet restore: ##[error]The nuget command failed with exit code(1) and error(Unable to load the service index for source https://tfs.server.be/tfs/DefaultCollection/_packaging/797f899f-9ad1-4158-93bc-8f3293cf4a59/nuget/v3/index.json. Response status code does not indicate success: 403 (Forbidden - User 'Build\534a066a-3992-4851-a816-b189836bee69' lacks permission to complete this action. You need to have 'ReadPackages'. (DevOps Activity ID: 8D53BF1D-E45E-49C4-879F-6CBD8635D1CE)). Although the solution I mentioned in the original blog post should work, I found a different solution that should also do the trick. Here are the steps: Go to the Azure DevOps project that contains the failed pipeline in Azure DevOps. Go to Artifacts. Select the feed that causes the problem from the drop down (if not selected by default). Click the Feed Settings gear in the top right corner.

Batching work in SQL Server

In one of our ASP.NET Core applications, I added a new feature to clean up old data. My implementation was simple and used a BackgroundService to run a cleanup script at periodic intervals: All worked fine during development and testing, but when I deployed it to production it brought the whole application to a halt. What was happening? First, as this was the first time the script was run on production, there was a lot of old data. So while the query executed and completed quite fast on other environments, on production it impacted millions of rows. What made the problem even worse is that the table that should be cleaned up contained a large amount of binary data. This made the transaction log grow in size and further increased the query duration. My first attempt to improve the performance of this query was to delete the data based on the primary key, a suggestion I found here: How to Delete Large Amounts of Data – SQLServerCentral However the impact of this change was m
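The general batching idea can be sketched like this in T-SQL (table and column names are illustrative, not the ones from the post): deleting in small chunks keeps each transaction short and gives the transaction log room to breathe.

```sql
-- Delete old rows in small batches so each transaction stays short
-- (dbo.DocumentStore and CreatedOn are illustrative names)
DECLARE @rows INT = 1;

WHILE @rows > 0
BEGIN
    DELETE TOP (5000) FROM dbo.DocumentStore
    WHERE CreatedOn < DATEADD(MONTH, -6, GETUTCDATE());

    SET @rows = @@ROWCOUNT;
END
```

The loop stops as soon as a batch deletes zero rows; the batch size is a tuning knob that trades throughput against lock duration.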

.NET 8– System.Text.Json serializer error

I really like the System.Text.Json source generator as a way to optimize your API performance by reducing the serialization/deserialization cost. However after upgrading to .NET 8 I not only got some warnings, but one of my applications also failed during execution with the following error message: System.NotSupportedException: JsonTypeInfo metadata for type 'System.Collections.Generic.List`1[OrderDto]' was not provided by TypeInfoResolver of type 'JsonContext'. If using source generation, ensure that all root types passed to the serializer have been annotated with 'JsonSerializableAttribute', along with any types that might be serialized polymorphically.    at System.Text.Json.ThrowHelper.ThrowNotSupportedException_NoMetadataForType(Type type, IJsonTypeInfoResolver resolver)    at System.Text.Json.JsonSerializerOptions.GetTypeInfoInternal(Type type, Boolean ensureConfigured, Nullable`1 ensureNotNull, Boolean resolveIfMutable, Boolean fallBackToNearestAncestorType)
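The error message points at the fix: every root type passed to the serializer needs its own attribute on the context. A sketch (the `OrderDto` and `JsonContext` names come from the error above; the property is illustrative):

```csharp
using System.Collections.Generic;
using System.Text.Json.Serialization;

public class OrderDto
{
    public int Id { get; set; }
}

// [JsonSerializable(typeof(OrderDto))] alone does NOT cover List<OrderDto>;
// the collection type must be registered as a root type as well.
[JsonSerializable(typeof(OrderDto))]
[JsonSerializable(typeof(List<OrderDto>))]
public partial class JsonContext : JsonSerializerContext
{
}
```

With the extra attribute in place, the source generator emits the metadata for the list type and the NotSupportedException disappears.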