
Posts

Implementing an OAuth client credentials flow with ADFS–Part 3–Debugging the flow

It looks like most of the world has made the switch to Microsoft Entra (Azure Active Directory). However, one of my clients is still using ADFS. Unfortunately there isn't much information left on how to get an OAuth flow up and running in ADFS; most of the links I found point to documentation that no longer exists. Hence this short blog series showing you end-to-end how to get an OAuth Client Credentials flow configured in ADFS. Part 1 – ADFS configuration, Part 2 – Application configuration, Part 3 (this post) – Debugging the flow. In the first two posts I showed you the happy path, so if you did everything exactly as I showed, you should end up with a working Client Credentials flow in ADFS. Unfortunately there are a lot of small details that matter, and if you make one mistake you'll end up with a wide range of possible errors. In today's post, I focus on the preparation work to help us debug the process and better understand what is going on. Updating your OAuth Confi…
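A useful first debugging step is to check what your ADFS instance actually exposes. Below is a minimal C# sketch, assuming an ADFS 2016+ instance at a hypothetical host (adfs.contoso.com); it fetches the OpenID Connect discovery document so you can compare the issuer and token endpoint against what your client and API are configured with.

    using System;
    using System.Net.Http;

    // A debugging sketch, assuming a hypothetical ADFS host (adfs.contoso.com).
    // ADFS 2016+ publishes its OpenID Connect metadata under /adfs.
    using var client = new HttpClient();

    var metadata = await client.GetStringAsync(
        "https://adfs.contoso.com/adfs/.well-known/openid-configuration");

    // The issuer and token_endpoint values in this JSON are the ones your
    // client and API configuration must match exactly.
    Console.WriteLine(metadata);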
Recent posts

Implementing an OAuth client credentials flow with ADFS–Part 2–Application configuration

It looks like most of the world has made the switch to Microsoft Entra (Azure Active Directory). However, one of my clients is still using ADFS. Unfortunately there isn't much information left on how to get an OAuth flow up and running in ADFS; most of the links I found point to documentation that no longer exists. Hence this short blog series showing you end-to-end how to get an OAuth Client Credentials flow configured in ADFS. Part 1 – ADFS configuration, Part 2 (this post) – Application configuration. After doing all the configuration work in ADFS, I'll focus today on the necessary work that needs to be done on the application side. Configuring the API: we'll start by configuring the API part. First create a new ASP.NET Core API project: dotnet new webapi --use-controllers -o ExampleApi. Then add the 'Microsoft.AspNetCore.Authentication.JwtBearer' package to your project: dotnet add package Microsoft.AspNetCore.Authentication.JwtBearer. Add the auth…
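For reference, here is a minimal Program.cs sketch of what that JwtBearer setup can look like; it is not necessarily the exact code from the post, and the ADFS host (adfs.contoso.com) and API identifier are placeholders.

    // A minimal sketch, assuming ADFS 2016+ at a hypothetical host
    // adfs.contoso.com and a hypothetical API identifier registered in ADFS.
    using Microsoft.AspNetCore.Authentication.JwtBearer;

    var builder = WebApplication.CreateBuilder(args);

    builder.Services.AddControllers();
    builder.Services
        .AddAuthentication(JwtBearerDefaults.AuthenticationScheme)
        .AddJwtBearer(options =>
        {
            // ADFS exposes its OpenID Connect metadata under /adfs.
            options.Authority = "https://adfs.contoso.com/adfs";
            // Must match the identifier of the Web API configured in ADFS.
            options.Audience = "https://exampleapi.contoso.com";
        });

    var app = builder.Build();

    app.UseAuthentication();
    app.UseAuthorization();
    app.MapControllers();
    app.Run();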

Implementing an OAuth client credentials flow with ADFS–Part 1 - ADFS configuration

It looks like most of the world has made the switch to Microsoft Entra (Azure Active Directory). However, one of my clients is still using ADFS. Unfortunately there isn't much information left on how to get an OAuth flow up and running in ADFS; most of the links I found point to documentation that no longer exists. Hence this short blog series showing you end-to-end how to get an OAuth Client Credentials flow configured in ADFS. In today's post, I focus on the ADFS configuration. To keep things from getting unnecessarily complex, I'll show the steps using one of the simplest OAuth flows: the Client Credentials flow. OAuth Client Credentials flow: the OAuth Client Credentials flow is an authentication method used primarily for machine-to-machine (M2M) communication. In this flow, an application (the "client") requests an access token directly from an OAuth 2.0 authorization server using its own credentials, without involving a user. This access token allows the client to access…
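To make the flow concrete, this is roughly what the token request looks like in C#; all values (host, client id, secret, resource) are hypothetical placeholders rather than values from the post.

    using System;
    using System.Collections.Generic;
    using System.Net.Http;

    // A minimal sketch of the Client Credentials flow against ADFS,
    // with hypothetical host, client id, secret and resource values.
    using var client = new HttpClient();

    var response = await client.PostAsync(
        "https://adfs.contoso.com/adfs/oauth2/token",
        new FormUrlEncodedContent(new Dictionary<string, string>
        {
            ["grant_type"] = "client_credentials",          // no user involved
            ["client_id"] = "00000000-0000-0000-0000-000000000000",
            ["client_secret"] = "<client-secret>",
            ["resource"] = "https://exampleapi.contoso.com" // the target API
        }));

    // On success ADFS returns a JSON payload containing the access_token;
    // on failure it returns an error and error_description instead.
    Console.WriteLine(await response.Content.ReadAsStringAsync());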

JWT decoder in Visual Studio

So far I have always used JWT.io to decode my JWT tokens, but today I discovered that I don't have to leave my Visual Studio IDE anymore. Before I show you this feature, let's briefly summarize what JWT tokens are. JWT what? JWT (JSON Web Token) is a compact, URL-safe token format used for securely transmitting information between parties. It is commonly used in authentication and authorization scenarios, especially in web and mobile applications. A JWT consists of three parts: Header: contains metadata about the token, such as the signing algorithm used (e.g., HS256 or RS256). Payload: contains the claims, which are statements about the user or other data (e.g., user ID, roles). This data is not encrypted, so it should not include sensitive information. Signature: a cryptographic signature that ensures the token has not been tampered with. It is created by encoding the header and payload, then signing them using a secret key or public/private key pair.
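If you want to see for yourself that the header and payload are just Base64Url-encoded JSON, a few lines of C# will do. This is an illustrative sketch, not the Visual Studio feature itself.

    using System;

    // Decode the header and payload of a JWT by hand to show that the
    // dot-separated parts are plain Base64Url-encoded JSON (no decryption).
    static string DecodePart(string part)
    {
        // Base64Url uses '-' and '_' and drops padding; normalize it first.
        var base64 = part.Replace('-', '+').Replace('_', '/');
        base64 = base64.PadRight(base64.Length + (4 - base64.Length % 4) % 4, '=');
        return System.Text.Encoding.UTF8.GetString(Convert.FromBase64String(base64));
    }

    var jwt = "<paste a token here>";
    var parts = jwt.Split('.');

    Console.WriteLine(DecodePart(parts[0])); // header: algorithm and key id
    Console.WriteLine(DecodePart(parts[1])); // payload: the claims
    // parts[2] is the signature; verifying it requires the signing key.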

Running a fully local AI Code Assistant with Continue–Part 6–Troubleshooting

In a previous post I introduced you to Continue in combination with Ollama, as a way to run a fully local AI Code Assistant. Remark: this post is part of a bigger series. Here are the other related posts: Part 1 – Introduction, Part 2 – Configuration, Part 3 – Editing and Actions, Part 4 – Learning from your codebase, Part 5 – Read your documentation, Part 6 (this post) – Troubleshooting. Although Continue really looks promising, I stumbled on some hurdles along the way. Here are some tips in case you encounter issues. Tip 1 – Check the Continue logs: my first tip is to always check the logs. Continue provides good logging inside the IDE, so go to the Output tab and switch to the Continue source to see the generated output. Tip 2 – Check the Continue LLM logs: next to the output of Continue itself, you can find all the LLM-specific logs in the Continue LLM Prompt/Conversation output, so don't forget to check that output as well. Tip 3 – Be patient…

Running a fully local AI Code Assistant with Continue–Part 5–Read your documentation

In a previous post I introduced you to Continue in combination with Ollama, as a way to run a fully local AI Code Assistant. Remark: this post is part of a bigger series. Here are the other related posts: Part 1 – Introduction, Part 2 – Configuration, Part 3 – Editing and Actions, Part 4 – Learning from your codebase, Part 5 (this post) – Read your documentation. Today I want to continue by having a look at how Continue can scrape your documentation website and make the content accessible inside your IDE. The @docs context provider: to use this feature you need to use the @docs context provider. Once you type @docs you already get a long list of available documentation; this is because Continue offers out-of-the-box a selection of pre-indexed documentation sites (you can find the full list here). If you now ask a question, the indexed documentation is used to answer your question, and you can see the context used by expanding the context items section. Index you…

Hangfire–Change polling interval

In modern .NET applications, background processing plays a crucial role in handling tasks that don't need to be executed immediately, such as sending emails, processing large datasets, or scheduling recurring jobs. A library I like to use to manage background tasks in .NET is Hangfire, thanks to its simplicity, flexibility, and wide range of storage options. As these background tasks can be scheduled in the future, some kind of storage needs to be configured where Hangfire keeps all the information related to background job processing. Out of the box multiple storage systems are supported, one of them being SQL Server. Although SQL Server can be a convenient option as the main storage for Hangfire, it comes with one big disadvantage: polling is used to fetch new jobs. This has two consequences: it increases the load on your SQL Server instance, and the time to pick up a new job and execute it is significantly longer (here is a comparison with Redis from t…
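The title gives away where this is heading: Hangfire's SQL Server storage lets you tune that polling through SqlServerStorageOptions.QueuePollInterval. Here is a minimal sketch under the assumption of an ASP.NET Core host with the Hangfire and Hangfire.SqlServer packages; the "HangfireDb" connection string name is hypothetical.

    // A minimal sketch assuming the Hangfire and Hangfire.SqlServer packages;
    // the "HangfireDb" connection string name is hypothetical.
    using Hangfire;
    using Hangfire.SqlServer;

    var builder = WebApplication.CreateBuilder(args);

    builder.Services.AddHangfire(config => config
        .UseSqlServerStorage(
            builder.Configuration.GetConnectionString("HangfireDb"),
            new SqlServerStorageOptions
            {
                // Default is 15 seconds; a lower value picks up new jobs
                // faster at the cost of extra load on SQL Server.
                QueuePollInterval = TimeSpan.FromSeconds(1)
            }));
    builder.Services.AddHangfireServer();

    var app = builder.Build();
    app.Run();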