
Posts

Showing posts from October, 2024

Applying the ‘Wisdom of the crowd’ effect in software development

I’m currently reading Noise: A Flaw in Human Judgment, the latest book by Daniel Kahneman, who also wrote Thinking, Fast and Slow. I’m only halfway through the book, but in one chapter the authors describe an experiment by two researchers, Edward Vul and Harold Pashler, in which they asked a person the same specific question not once but twice. The hypothesis was that the average of the two answers would typically be closer to the truth than either answer on its own. And indeed, they were right. One knows more than one. It turns out that this is related to the wisdom-of-crowds effect: if you take the average of a number of independent answers from different people, it typically leads to a more accurate answer. I had never heard of this effect before, but it turns out I’ve been applying this principle for a long time, based on something I discovered in the Righting Software book by Juval Löwy: the broadband estimation technique. This technique allows you to estimate the implementation effort for a c...
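As a minimal illustration of the averaging idea (a hypothetical sketch, not Löwy’s exact procedure), here is how combining independent estimates looks in C#:

    // Hypothetical sketch: combine independent effort estimates (in days)
    // by averaging them, per the wisdom-of-crowds effect.
    using System;
    using System.Linq;

    double[] estimatesInDays = { 5, 8, 6, 12, 7 }; // gathered independently
    double combined = estimatesInDays.Average();
    Console.WriteLine($"Combined estimate: {combined:F1} days"); // 7.6 days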

.NET 8 upgrade - error NETSDK1045: The current .NET SDK does not support targeting .NET 8.0.

A colleague asked me to create a small fix in an existing library. I implemented the fix and decided to take the occasion to upgrade to .NET 8 as well. How hard can it be… Turns out this was harder than I thought. After upgrading the target framework moniker to .NET 8, the build started to fail with the following (cryptic) error message: C:\Program Files\dotnet\sdk\6.0.407\Sdks\Microsoft.NET.Sdk\targets\Microsoft.NET.TargetFrameworkInference.targets(144,5): error NETSDK1045: The current .NET SDK does not support targeting .NET 8.0. Either target .NET 6.0 or lower, or use a version of the .NET SDK that supports .NET 8.0. The .NET 8 SDK was certainly installed on this machine, so that could not be the issue: Then I took a second look at the error message and noticed something: the build was using the .NET 6 SDK, although the application itself was configured to target .NET 8. Of course! Now I remembered. In this project I was using a global.json file. Through this file y...
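For context, a global.json pins the SDK version for every project under its folder. A minimal example, using the 6.0.407 version that shows up in the error message above:

    {
      "sdk": {
        "version": "6.0.407"
      }
    }

Updating this pinned version to an installed 8.x SDK, or removing the file altogether, lets the build pick up the .NET 8 SDK.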

SQL Server - Use Table Valued parameters to construct an IN statement

A colleague created a stored procedure that returns some data from a specific table. Nothing special, you would think, and you are right. The only reason we were using a stored procedure here is that we had a very specific requirement that every attempt to read data from this table should be logged. Here is a simplified version of the stored procedure he created: What I want to talk about in this post is the use of a (comma-separated) string parameter to construct the filter criteria for the query. Remark: This version of the stored procedure is already better than the original version, which used dynamic SQL to construct an IN clause: Using Table-Valued Parameters We can further improve the procedure above by using a table-valued parameter. To use table-valued parameters instead of comma-separated strings in your stored procedure, you can follow these steps: Step 1: Create a Table-Valued Parameter Type First, you need to create a table-valued...
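To give an idea of the caller side, here is a hedged C# sketch of passing a table-valued parameter to a stored procedure. The type dbo.IdList and procedure dbo.GetItems are hypothetical names standing in for the real ones:

    // Hedged sketch: pass a table-valued parameter from C#.
    // Assumes (hypothetically) this SQL type and procedure exist:
    //   CREATE TYPE dbo.IdList AS TABLE (Id INT NOT NULL);
    //   CREATE PROCEDURE dbo.GetItems @Ids dbo.IdList READONLY ...
    using System.Data;
    using Microsoft.Data.SqlClient;

    string connectionString = "<your connection string>";

    var ids = new DataTable();
    ids.Columns.Add("Id", typeof(int));
    ids.Rows.Add(1);
    ids.Rows.Add(2);

    using var connection = new SqlConnection(connectionString);
    using var command = new SqlCommand("dbo.GetItems", connection)
    {
        CommandType = CommandType.StoredProcedure
    };
    var parameter = command.Parameters.AddWithValue("@Ids", ids);
    parameter.SqlDbType = SqlDbType.Structured; // marks it as a TVP
    parameter.TypeName = "dbo.IdList";          // the type created in step 1

    connection.Open();
    using var reader = command.ExecuteReader();

Because the ids travel as a strongly typed rowset, SQL Server can join against them directly, with no string parsing and no dynamic SQL.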

EF Core–Read and write models

Today I was working with a team that was implementing a CQRS architecture. CQRS (Command Query Responsibility Segregation) is a design pattern that separates the responsibilities of reading and writing data into distinct models. The idea is to use one model to handle commands (which modify data) and another model to handle queries (which retrieve data). This separation allows for better scalability, performance optimization, and flexibility, as the read and write operations can be independently optimized or scaled based on the specific needs of the system. After creating a read model for a specific table in the database, EF Core started to complain and returned the following error message: System.InvalidOperationException: Cannot use table 'Categories' for entity type 'CategoryReadModel' since it is being used for entity type 'Category' and potentially other entity types, but there is no linking relationship. Add a foreign key to 'CategoryReadModel' on the...
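One possible way out (a sketch of a common technique, not necessarily the fix the team chose) is to map the read model as a keyless projection over the same table, so EF Core no longer expects a linking relationship; the column list here is an assumption:

    // Hedged sketch: map a read model over the same table without table sharing.
    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        modelBuilder.Entity<CategoryReadModel>(builder =>
        {
            builder.HasNoKey(); // read-only: no key, no change tracking
            builder.ToSqlQuery("SELECT Id, Name FROM Categories"); // assumed columns
        });
    }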

Architecting Your Team Setup: Aligning Teams with Software Design (and Vice Versa)

As software architects, we tend to focus heavily on the design of the systems we build—how the various components interact, the data flow, and the technology choices. But architecture doesn’t exist in a vacuum. One often overlooked element is how the structure of our teams can (and should) align with the architecture itself. The relationship between team setup and software design is symbiotic: your team’s structure influences the system’s architecture, and the architecture shapes how teams need to work together. Getting this alignment right can be the key to efficiency, scalability, and long-term success. Why Team Setup Matters in Software Architecture There’s an adage known as Conway’s Law, which states: Any organization that designs a system (defined broadly) will produce a design whose structure is a copy of the organization's communication structure. In simple terms, the way your teams are structured will be reflected in your system’s architecture. If your teams d...

Implementing an OAuth client credentials flow with ADFS–Part 4–Understanding and fixing returned error codes

It looks like most of the world has made the switch to Microsoft Entra (formerly Azure Active Directory). However, one of my clients is still using ADFS. Unfortunately, there isn’t much information left on how to get an OAuth flow up and running in ADFS; most of the links I found point to documentation that no longer exists. Hence this short blog series to show you end-to-end how to get an OAuth Client Credentials flow configured in ADFS. Part 1 - ADFS configuration Part 2 – Application configuration Part 3 – Debugging the flow Part 4 (this post) – Understanding and fixing returned error codes In the last post we updated our configuration so we could see any errors returned and were able to debug the authentication flow. In the first two posts I showed you everything that was needed to get up and running. The reality was that it took some trial and error to get there. In this post I share all the errors I got along the way and how I fixed them. IDX10204: Unabl...
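For reference, IDX10204 comes from the Microsoft.IdentityModel validation stack and indicates that no valid issuer was configured for token validation. A hedged sketch of the usual remedy on the API side; the issuer and audience values are placeholders for your environment:

    // Hedged sketch: configure an explicit issuer for token validation.
    // ADFS issues access tokens with http://<adfs-host>/adfs/services/trust
    // as issuer by default; adjust to your environment.
    builder.Services.AddAuthentication(JwtBearerDefaults.AuthenticationScheme)
        .AddJwtBearer(options =>
        {
            options.TokenValidationParameters = new TokenValidationParameters
            {
                ValidIssuer = "http://adfs.example.com/adfs/services/trust",
                ValidAudience = "api://exampleapi" // placeholder
            };
        });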

Implementing an OAuth client credentials flow with ADFS–Part 3–Debugging the flow

It looks like most of the world has made the switch to Microsoft Entra (formerly Azure Active Directory). However, one of my clients is still using ADFS. Unfortunately, there isn’t much information left on how to get an OAuth flow up and running in ADFS; most of the links I found point to documentation that no longer exists. Hence this short blog series to show you end-to-end how to get an OAuth Client Credentials flow configured in ADFS. Part 1 - ADFS configuration Part 2 – Application configuration Part 3 (this post) – Debugging the flow In the first two posts I showed you the happy path. So if you did everything exactly as I showed, you should end up with a working Client Credentials flow in ADFS. Unfortunately, there are a lot of small details that matter, and if you make one mistake you’ll end up with a wide range of possible errors. In today’s post, I focus on the preparation work that helps us debug the process and better understand what is going on. Updating your OAuth Confi...
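One preparation step that pays off (a sketch of a common technique, not necessarily the exact changes from the post) is turning on detailed IdentityModel logging, so token errors show real values instead of redacted placeholders:

    // Hedged sketch: surface detailed token-validation errors while debugging.
    // ShowPII logs potentially sensitive values; enable it only in development.
    Microsoft.IdentityModel.Logging.IdentityModelEventSource.ShowPII = true;

    builder.Services.AddAuthentication(JwtBearerDefaults.AuthenticationScheme)
        .AddJwtBearer(options =>
        {
            options.Events = new JwtBearerEvents
            {
                OnAuthenticationFailed = context =>
                {
                    Console.WriteLine(context.Exception); // full failure reason
                    return Task.CompletedTask;
                }
            };
        });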

Implementing an OAuth client credentials flow with ADFS–Part 2–Application configuration

It looks like most of the world has made the switch to Microsoft Entra (formerly Azure Active Directory). However, one of my clients is still using ADFS. Unfortunately, there isn’t much information left on how to get an OAuth flow up and running in ADFS; most of the links I found point to documentation that no longer exists. Hence this short blog series to show you end-to-end how to get an OAuth Client Credentials flow configured in ADFS. Part 1 - ADFS configuration Part 2 (this post) – Application configuration After doing all the configuration work in ADFS, I’ll focus today on the work that needs to be done on the application side. Configuring the API We’ll start by configuring the API part. First create a new ASP.NET Core API project: dotnet new webapi --use-controllers -o ExampleApi Add the ‘Microsoft.AspNetCore.Authentication.JwtBearer’ package to your project: dotnet add package Microsoft.AspNetCore.Authentication.JwtBearer Add the auth...
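To sketch the authentication setup this is heading towards (a hedged sketch with placeholder URLs, not my client’s real configuration):

    // Hedged sketch: JWT bearer authentication against ADFS in Program.cs.
    // Replace the metadata address and audience with your own values.
    builder.Services.AddAuthentication(JwtBearerDefaults.AuthenticationScheme)
        .AddJwtBearer(options =>
        {
            options.MetadataAddress =
                "https://adfs.example.com/adfs/.well-known/openid-configuration";
            options.Audience = "api://exampleapi"; // your API identifier in ADFS
        });

    builder.Services.AddAuthorization();

    var app = builder.Build();
    app.UseAuthentication();
    app.UseAuthorization();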

Implementing an OAuth client credentials flow with ADFS–Part 1 - ADFS configuration

It looks like most of the world has made the switch to Microsoft Entra (formerly Azure Active Directory). However, one of my clients is still using ADFS. Unfortunately, there isn’t much information left on how to get an OAuth flow up and running in ADFS; most of the links I found point to documentation that no longer exists. Hence this short blog series to show you end-to-end how to get an OAuth Client Credentials flow configured in ADFS. In today’s post, I focus on the ADFS configuration. To avoid unnecessary complexity, I’ll show the steps using one of the simplest OAuth flows: the Client Credentials flow. OAuth Client Credentials flow The OAuth Client Credentials flow is an authentication method used primarily for machine-to-machine (M2M) communication. In this flow, an application (the "client") requests an access token directly from an OAuth 2.0 authorization server using its own credentials, without involving a user. This access token allows the client to acces...
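To make the flow concrete, here is a hedged C# sketch of the token request a client sends, assuming the standard ADFS token endpoint and placeholder credentials:

    // Hedged sketch: request an access token via the client credentials grant.
    // Endpoint, client id, secret, and resource are placeholders.
    using var http = new HttpClient();

    var response = await http.PostAsync(
        "https://adfs.example.com/adfs/oauth2/token",
        new FormUrlEncodedContent(new Dictionary<string, string>
        {
            ["grant_type"] = "client_credentials",
            ["client_id"] = "<your client id>",
            ["client_secret"] = "<your client secret>",
            ["resource"] = "api://exampleapi" // identifier of the target API
        }));

    Console.WriteLine(await response.Content.ReadAsStringAsync());
    // On success the JSON body contains the access_token.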

JWT decoder in Visual Studio

So far I have always used JWT.io to decode my JWT tokens. But today I discovered that I don’t have to leave my Visual Studio IDE anymore. Before I show you this feature, let’s briefly summarize what JWT tokens are. JWT what? JWT (JSON Web Token) is a compact, URL-safe token format used for securely transmitting information between parties. It is commonly used in authentication and authorization scenarios, especially in web and mobile applications. A JWT consists of three parts: Header: Contains metadata about the token, such as the signing algorithm used (e.g., HS256 or RS256). Payload: Contains the claims, which are statements about the user or other data (e.g., user ID, roles). This data is not encrypted, so it should not include sensitive information. Signature: A cryptographic signature that ensures the token has not been tampered with. It is created by encoding the header and payload, then signing them using a secret key or public/private key pair. ...
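Since a JWT is just three base64url-encoded segments separated by dots, you can decode the payload by hand in a few lines of C#; a minimal sketch:

    // Minimal sketch: decode the payload segment of a JWT by hand.
    using System;
    using System.Text;

    static string DecodePayload(string jwt)
    {
        // A JWT is header.payload.signature; take the middle segment.
        string payload = jwt.Split('.')[1];

        // Base64url -> Base64: swap URL-safe characters and restore padding.
        string base64 = payload.Replace('-', '+').Replace('_', '/');
        base64 = base64.PadRight(base64.Length + (4 - base64.Length % 4) % 4, '=');

        return Encoding.UTF8.GetString(Convert.FromBase64String(base64));
    }

Note that this only reads the token; verifying the signature still requires the key and a proper library.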

Running a fully local AI Code Assistant with Continue–Part 6–Troubleshooting

In a previous post I introduced you to Continue in combination with Ollama, as a way to run a fully local AI Code Assistant. Remark: This post is part of a bigger series. Here are the other related posts: Part 1 – Introduction Part 2 – Configuration Part 3 – Editing and Actions Part 4 – Learning from your codebase Part 5 – Read your documentation Part 6 (this post) – Troubleshooting Although Continue really looks promising, I stumbled on some hurdles along the way. Here are some tips in case you encounter issues: Tip 1 – Check the Continue logs My first tip is to always check the logs. Continue provides good logging inside the IDE, so go to the output tab and switch to the Continue source to get the generated output: Tip 2 – Check the Continue LLM logs Next to the output of Continue itself, you can find all the LLM-specific logs in the Continue LLM Prompt/Conversation output. So don’t forget to check that output as well: Tip 3 ...

Running a fully local AI Code Assistant with Continue–Part 5–Read your documentation

In a previous post I introduced you to Continue in combination with Ollama, as a way to run a fully local AI Code Assistant. Remark: This post is part of a bigger series. Here are the other related posts: Part 1 – Introduction Part 2 – Configuration Part 3 – Editing and Actions Part 4 – Learning from your codebase Part 5 (this post) – Read your documentation Today I want to continue by having a look at how Continue can scrape your documentation website and make the content accessible inside your IDE. The @docs context provider To use this feature you need to use the @docs context provider: Once you type @docs you already get a long list of available documentation. This is because Continue offers a selection of pre-indexed documentation sites out of the box. (You can find the full list here.) If you now ask a question, the indexed documentation is used to answer it: You can see the context used by expanding the context items sectio...
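Adding your own site next to the pre-indexed ones is a configuration change. A sketch of what such an entry looks like in Continue’s config.json; the title and URL are placeholders, and the exact schema may differ per Continue version:

    {
      "docs": [
        {
          "title": "My internal docs",
          "startUrl": "https://docs.example.com/"
        }
      ]
    }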

Hangfire–Change polling interval

In modern .NET applications, background processing plays a crucial role in handling tasks that don't need to be executed immediately, such as sending emails, processing large datasets, or scheduling recurring jobs. A library I like to use to manage background tasks in .NET is Hangfire, thanks to its simplicity, flexibility, and wide range of storage options. As these background tasks can be scheduled in the future, some kind of storage needs to be configured where Hangfire keeps all the information related to background job processing. Out of the box multiple storage systems are supported, one of them being SQL Server. Although SQL Server can be a convenient option as the main storage for Hangfire, it comes with one big disadvantage: polling is used to fetch new jobs. This has two consequences: it increases the load on your SQL Server instance, and the time to pick up a new job and execute it is significantly longer (here is a comparison with Redis f...
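The knob for this when staying on SQL Server is the QueuePollInterval storage option. A sketch of dialing it down; the 5-second value is just an example:

    // Hedged sketch: lower Hangfire's SQL Server polling interval
    // (the default is 15 seconds). connectionString points to the
    // database where Hangfire keeps its job data.
    builder.Services.AddHangfire(config => config
        .UseSqlServerStorage(connectionString, new SqlServerStorageOptions
        {
            QueuePollInterval = TimeSpan.FromSeconds(5)
        }));

A shorter interval means faster job pickup at the cost of more frequent queries against the database.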

Microsoft Student Innovator Series

If you are a student and enthusiastic about AI and the cloud in general, and Microsoft technologies specifically, then I would encourage you to check out the Microsoft Innovator Series. This series offers a range of events that can help you improve your technical skills and is a good head start for the Imagine Cup, Microsoft's premier global technology startup competition for student founders. There are still some events planned between now and the end of this year, so certainly have a look! More information Event Series | Microsoft Reactor A Journey of Innovation and Inspiration with Microsoft Imagine Cup 2025 - Microsoft Community Hub

Semantic Kernel–Plugin KeyNotFoundException

Plugins are a key element when building agents in Semantic Kernel. They allow you to extend the capabilities of your Large Language Model with extra functions. This post is not about writing your own plugin, but just as a reminder, registering a plugin can be done like this: Here is the code I used to try to invoke the TimePlugin in a prompt: However, when I did this, the following error was returned: System.Collections.Generic.KeyNotFoundException: 'The plugin collection does not contain a plugin and/or function with the specified names. Plugin name – 'time', function name - 'Date'.' The problem was that I had configured an alias name for the plugin but was using the plugin name in the prompt. There are two ways to fix the issue above. Either I update the prompt to point to the configured alias: Or I remove the alias from my configuration: Hope that helps… More information Plugins in Semantic Kernel | Microsoft Learn
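To make the mismatch concrete, here is a hedged sketch (the alias name is illustrative): whatever name the plugin is registered under is the name the prompt must use.

    // Hedged sketch: the registration name must match the prompt.
    using Microsoft.SemanticKernel;
    using Microsoft.SemanticKernel.Plugins.Core;

    var builder = Kernel.CreateBuilder();
    builder.Plugins.AddFromType<TimePlugin>("myTime"); // registered under alias 'myTime'
    var kernel = builder.Build();

    // Fails with KeyNotFoundException, because no plugin is named 'time':
    //   await kernel.InvokePromptAsync("Today is {{time.Date}}");

    // Works, because the prompt references the registered alias:
    var result = await kernel.InvokePromptAsync("Today is {{myTime.Date}}");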

Web Deploy error - Source does not support parameter called 'IIS Web Application Name'.

At one of my customers, everything is still hosted on premises on multiple IIS web servers. To deploy web applications, we are using Web Deploy. This works quite nicely and allows us to deploy web applications in an automated way. Last week, a colleague contacted me after configuring the deployment pipeline in Azure DevOps. When the pipeline tried to deploy the application, it failed with the following error message: "System.Exception: Error: Source does not support parameter called 'IIS Web Application Name'. Must be one of (Environment)" Here is a more complete build log to give some extra context: Starting deployment of IIS Web Deploy Package : \\<servername>\DevDrop\BOSS.Intern.Web.zip Performing deployment in parallel on all the machines. Deployment started for machine: <servername> with port...

MassTransit ConsumerDefinitions vs EndpointConfiguration - Understanding the Differences

In message-driven systems, configuring consumers correctly is key to achieving maintainable, scalable, and flexible systems. In MassTransit, two common approaches are using ConsumerDefinitions and endpoint configuration. While both serve the purpose of defining how consumers work within the system, they differ in terms of flexibility, separation of concerns, and implementation details. In this post, we’ll explore the differences and best practices for using them. MassTransit Consumers: The Basics Before diving into the comparison, let’s briefly cover what a consumer is in MassTransit. A consumer is a class responsible for handling incoming messages from a message broker (such as RabbitMQ or Azure Service Bus). Consumers are central to the processing pipeline and play a key role in event-driven architectures. Here is a simple example: Consumer Configuration The question now is how this consumer will be wired to the underlying transport mechanism. As mentioned...
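As a hedged sketch (the consumer, message, and endpoint names are hypothetical), here is what a consumer with a ConsumerDefinition and its registration typically look like:

    // Hedged sketch: a consumer plus a ConsumerDefinition that holds its
    // endpoint-level configuration.
    using MassTransit;

    public record SubmitOrder(Guid OrderId);

    public class SubmitOrderConsumer : IConsumer<SubmitOrder>
    {
        public Task Consume(ConsumeContext<SubmitOrder> context)
        {
            Console.WriteLine($"Processing order {context.Message.OrderId}");
            return Task.CompletedTask;
        }
    }

    public class SubmitOrderConsumerDefinition : ConsumerDefinition<SubmitOrderConsumer>
    {
        public SubmitOrderConsumerDefinition()
        {
            EndpointName = "submit-order";  // queue name
            ConcurrentMessageLimit = 8;     // endpoint-level tuning lives here
        }
    }

    // Registration: the definition is picked up alongside the consumer.
    services.AddMassTransit(x =>
    {
        x.AddConsumer<SubmitOrderConsumer, SubmitOrderConsumerDefinition>();
        x.UsingRabbitMq((context, cfg) => cfg.ConfigureEndpoints(context));
    });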

Azure Monitor Log Analytics–Identify high memory usage

Last week we had a production issue at one of my customers where a server went offline due to high memory usage. So far the bad news. The good news is that we had Azure Application Insights monitoring in place, so we could easily validate that the high memory usage was causing the issue, as our Application Insights logs showed a long list of OutOfMemoryExceptions. However, as a separate Application Insights instance was used per application, we couldn’t easily pinpoint which application was the main culprit. Remark: Unfortunately it isn’t possible to show multiple resources on the Metrics tab, so that is not an option (you can upvote the feature if you like it): I could go through every Application Insights resource one by one, but that wouldn’t be very efficient. Therefore I decided to turn to KQL and write a query on top of the Log Analytics workspace where all the data was consolidated. Here is the query I used in the end: And here is what the result looked like wh...
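The general shape of such a query (a hypothetical sketch; the table, counter, and column names are assumptions, not the exact query from the incident) can also be run from C# against the workspace:

    // Hypothetical sketch: rank applications by memory counters across all
    // Application Insights resources feeding one Log Analytics workspace.
    using Azure.Identity;
    using Azure.Monitor.Query;

    var client = new LogsQueryClient(new DefaultAzureCredential());

    string query = @"
    AppPerformanceCounters
    | where Name == 'Private Bytes'
    | summarize avg(Value) by AppRoleName
    | order by avg_Value desc";

    var response = await client.QueryWorkspaceAsync(
        "<workspace id>", query, new QueryTimeRange(TimeSpan.FromDays(1)));

    foreach (var row in response.Value.Table.Rows)
        Console.WriteLine($"{row["AppRoleName"]}: {row["avg_Value"]}");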

Semantic Kernel–Giving the new Ollama connector a try

As Semantic Kernel can work with any OpenAI-compatible endpoint, and Ollama exposes its language models through an OpenAI-compatible API, combining the two was always possible. However, not all features of Ollama were accessible through Semantic Kernel. With the recent release of a dedicated Ollama connector for Semantic Kernel, we can start using some of the more advanced Semantic Kernel features directly against Ollama-deployed models. The new connector uses OllamaSharp (I talked about it in this post), so you can directly access the underlying library if needed. Giving the new connector a try… Create a new Console application and add the Microsoft.SemanticKernel.Connectors.Ollama NuGet package: dotnet add package Microsoft.SemanticKernel.Connectors.Ollama --version 1.21.1-alpha Now instead of creating a Semantic Kernel instance, we can directly create an OllamaChatCompletionService instance: The remaining part of the code remains the same as with the default ...
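A hedged sketch of what that looks like; the model id is an assumption, the endpoint is Ollama’s default local port, and this alpha API surface may still change:

    // Hedged sketch: chat with a local Ollama model through the new connector.
    using Microsoft.SemanticKernel.ChatCompletion;
    using Microsoft.SemanticKernel.Connectors.Ollama;

    var chatService = new OllamaChatCompletionService(
        modelId: "llama3.1",                          // assumed local model
        endpoint: new Uri("http://localhost:11434")); // Ollama's default port

    var history = new ChatHistory();
    history.AddUserMessage("Write a haiku about local LLMs.");

    var reply = await chatService.GetChatMessageContentAsync(history);
    Console.WriteLine(reply.Content);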