
Posts

Showing posts from 2023

Azure Static Web App–Traffic splitting

As a follow-up to the presentation I did at CloudBrew about Azure Static Web Apps I want to write a series of blog posts. Part I - Using the VS Code Extension Part II - Using the Astro Static Site Generator Part III – Deploying to multiple environments Part IV – Password protect your environments Part V (this post) – Traffic splitting Yesterday I talked about limiting access to your staging environment by password protecting it. This allows you to work with a limited set of test users who can access the staging environment (assuming they got the visitor password). However, sometimes you want to do something like a canary deployment, where you redirect a small subset of your users in production to a new version of your application. This is also possible in Azure Static Web Apps through the concept of traffic splitting. To activate this feature, go to your Azure Static Web App resource in the Azure portal and open Environments.

Azure Static Web App–Password protect your environments

As a follow-up to the presentation I did at CloudBrew about Azure Static Web Apps I want to write a series of blog posts. Part I - Using the VS Code Extension Part II - Using the Astro Static Site Generator Part III – Deploying to multiple environments Part IV (this post) – Password protect your environments Yesterday I showed you how you can have multiple environments for your Azure Static Web App. However, not every environment is production-ready or should be accessible to everyone. To limit access you can password protect your staging environments or all environments. To enable this feature, you need to open the Azure Portal and go to your static web app resource. Go to Settings –> Configuration. Switch to the General Settings tab. Change the Password protection setting from Disabled to Protect staging environments only to protect only your app's pre-production environments or Protect both production and staging environments to protect all environm

Azure Static Web App–Deploying to multiple environments

As a follow-up to the presentation I did at CloudBrew about Azure Static Web Apps I want to write a series of blog posts. Part I - Using the VS Code Extension Part II - Using the Astro Static Site Generator Part III (this post) – Deploying to multiple environments So far we have deployed our application to one environment, the production environment. Your custom domain points to this environment, and content served from this location is indexed by search engines. However, it is possible to use three other types of (staging) environments: Pull requests: Pull requests against your production branch deploy to a temporary environment that disappears after the pull request is closed. The URL for this environment includes the PR number as a suffix. For example, if you make your first PR, the preview location looks something like <DEFAULT_HOST_NAME>-1.<LOCATION>.azurestaticapps.net. Branch: You can optionally configure your site to deploy eve
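
The excerpt cuts off before the configuration details, but for branch environments the key piece is the production_branch input of the Azure Static Web Apps GitHub Action. A minimal sketch of the deploy step, where the secret name, folder locations and branch name are assumptions for illustration:

```yaml
# Hypothetical deploy step; app_location, output_location and the secret name depend on your project.
- name: Build And Deploy
  uses: Azure/static-web-apps-deploy@v1
  with:
    azure_static_web_apps_api_token: ${{ secrets.AZURE_STATIC_WEB_APPS_API_TOKEN }}
    repo_token: ${{ secrets.GITHUB_TOKEN }}
    action: "upload"
    app_location: "/"          # location of the application code
    output_location: "dist"    # build output folder
    production_branch: "main"  # pushes to other configured branches deploy to a named preview environment
```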

Azure Static Web Apps - Using the Astro Static Site Generator

As a follow-up to the presentation I did at CloudBrew about Azure Static Web Apps I want to write a series of blog posts. Part I - Using the VS Code Extension Part II (this post) - Using the Astro Static Site Generator Yesterday I showed you how to deploy your first Static Web App using the VS Code extension. As an example I used a static website created using Astro. What I didn't mention is that the first time the GitHub Actions workflow tried to build and deploy the application, it failed with the following error message: Build error: Node.js v16.20.2 is not supported by Astro! Please upgrade Node.js to a supported version: ">=18.14.1" The problem is that the GitHub Actions workflow created by the extension assumes an older Node.js version. As Astro requires a more recent Node.js version, the Astro build fails. To resolve this we need to update our package.json to explicitly specify a Node.js version by adding an engines section:
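
The excerpt ends right where the snippet would appear; based on the version range from the error message, the engines section looks something like this (only the relevant part of package.json is shown):

```json
{
  "engines": {
    "node": ">=18.14.1"
  }
}
```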

Azure Static Web Apps – VS Code extension

As a follow-up to the presentation I did at CloudBrew about Azure Static Web Apps I want to write a series of blog posts. In this first post I'll focus on the Azure Static Web Apps extension for VS Code. After installing this extension you can quickly create your Static Web App directly from the VS Code IDE. Open a folder that contains your frontend app in VS Code. Go to the Azure extension and click on the '+' sign. Choose Create Static Web App from the list of options. Specify a name for the new SWA. Select a region for your staging environments and the Managed Azure Function (if you have one). Now we can choose a preset to configure the default project structure. Azure Static Web Apps will use this information to determine things like the location of our code, the build output location, and so on. In our example we have created a static website using Astro, a static site generator. As no preset exists for Astro, we need to select the Custom option. Now
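
For an Astro project, the custom preset values typically map to the fragment below in the generated workflow; the exact paths depend on your project layout, but Astro writes its build output to dist by default:

```yaml
# Typical values for an Astro site (assumed layout, no Managed Functions API).
app_location: "/"        # root of the Astro project
api_location: ""         # no Azure Functions API
output_location: "dist"  # Astro's default build output folder
```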

.NET 8– Upgrade warnings

As our policy is to keep our .NET applications on the latest LTS version, we started to upgrade our application portfolio to .NET 8. Changing the Target Framework Moniker and updating the ASP.NET Core NuGet packages was sufficient to get the upgrade done. However, the upgrade also triggered some warnings, and because we had 'Treat warnings as errors' enabled in Visual Studio, our applications refused to compile. SYSLIB0051 The first warning we got was SYSLIB0051: Warning SYSLIB0051 'Exception.Exception(SerializationInfo, StreamingContext)' is obsolete: 'This API supports obsolete formatter-based serialization. It should not be called or extended by application co
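
The excerpt is cut off before the resolution, but SYSLIB0051 typically shows up on custom exceptions that still carry the legacy serialization constructor. One way to make the warning go away is simply to drop that constructor (and the related [Serializable] plumbing); a minimal sketch with a hypothetical exception type:

```csharp
using System;

// Hypothetical custom exception. The SerializationInfo/StreamingContext constructor
// that used to be added for BinaryFormatter support is what triggers SYSLIB0051
// and can simply be removed on .NET 8.
public class OrderNotFoundException : Exception
{
    public OrderNotFoundException() { }

    public OrderNotFoundException(string message) : base(message) { }

    public OrderNotFoundException(string message, Exception innerException)
        : base(message, innerException) { }
}
```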

C#– Record types copy constructor

C# record types are designed to be immutable by default. When using a C# record type, you can create a new instance based on an existing instance using the with expression. Behind the scenes the compiler will generate a Clone method that calls a copy constructor: Although a default copy constructor is generated for you, you can create your own if you want to:
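
The code blocks didn't survive in this excerpt; a minimal sketch of what is being described, using a hypothetical Person record, the with expression and a hand-written copy constructor:

```csharp
// 'with' copies the instance through the compiler-generated Clone method,
// which calls the copy constructor, and then applies the changes.
var person = new Person("John", "Doe");
var changed = person with { FirstName = "Jane" };
Console.WriteLine(changed); // Person { FirstName = Jane, LastName = DOE }

public record Person(string FirstName, string LastName)
{
    // Custom copy constructor: replaces the one the compiler would otherwise generate.
    protected Person(Person original)
    {
        FirstName = original.FirstName;
        LastName = original.LastName.ToUpperInvariant(); // custom copy behavior, just for illustration
    }
}
```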

Hangfire - Passing arguments using private setters

Hangfire is one of the easiest ways to embed background processing in your .NET (Core) applications. It does all the hard work for you so you can focus on the application logic itself. Scheduling a recurring job is as easy as using this oneliner: As you can see, background jobs in Hangfire look like regular method calls. Hangfire will collect and serialize the method information using the Newtonsoft.Json package and store the resulting JSON. In real-life applications I typically need to pass some arguments to the background job, which is supported as well: In the code above the Message object is serialized as well. Although this is correctly serialized and stored by Hangfire, trouble arrives when this scheduled task is actually invoked. Hangfire will deserialize our Message object, but the properties with private setters will remain null. If you don't mind having a dependency on Newtonsoft.Json, you can fix it by adding a [JsonProperty] attribute: Or the option that I prefer i
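
The inline snippets are missing from this excerpt; a minimal sketch of what is being described, with a hypothetical Message type and job, the recurring-job one-liner and the [JsonProperty] fix:

```csharp
using System;
using Hangfire;
using Newtonsoft.Json;

// Hypothetical message type with a private setter.
public class Message
{
    // Without [JsonProperty], Newtonsoft.Json skips this property when Hangfire
    // deserializes the stored job arguments, so it would stay null at execution time.
    [JsonProperty]
    public string Body { get; private set; }

    public Message() { }

    public Message(string body)
    {
        Body = body;
    }
}

public class MessageProcessor
{
    public void Process(Message message) => Console.WriteLine($"Processing: {message.Body}");
}

public static class JobSetup
{
    public static void ScheduleJobs()
    {
        // Scheduling a recurring job is a one-liner; the job id and cron expression are examples.
        RecurringJob.AddOrUpdate<MessageProcessor>(
            "process-message",
            processor => processor.Process(new Message("Hello Hangfire")),
            Cron.Daily());
    }
}
```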

Building distributed systems–Retry storms

With the popularity of microservices, distributed systems have become the norm. Although distributed systems can certainly help ensure scalability and resilience, they come with their fair share of challenges. One particular challenge I want to talk about today is the retry storm. What is a retry storm? A retry storm occurs when a large number of clients simultaneously attempt to retry failed requests, overwhelming the system and exacerbating the initial issue. In this blog post, we'll explore the causes of retry storms and provide practical strategies to avoid them. Let's imagine we have a range of services calling each other: If each service has a retry policy installed which retries 3 times (resulting in a total of 3+1 requests per call), a call chain three services deep would result in 4 × 4 × 4 = 64 times(!) the normal traffic. Retry policies that are executed without context like this cause traffic to grow exponentially. How can we avoid it? Some measures that can help avoid a retry storm are: Jittered Exponential Backoff:
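
The excerpt stops at the first mitigation; purely as an illustration (not necessarily the implementation from the post), a hand-rolled jittered exponential backoff in C# could look like this:

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

public static class RetryHelper
{
    private static readonly Random Jitter = new();

    // Retries an operation with exponential backoff plus random jitter, so that many
    // clients hitting the same failure don't all retry at exactly the same moment.
    public static async Task<T> ExecuteWithRetryAsync<T>(Func<Task<T>> operation, int maxRetries = 3)
    {
        for (var attempt = 0; ; attempt++)
        {
            try
            {
                return await operation();
            }
            catch (HttpRequestException) when (attempt < maxRetries)
            {
                // 2^attempt seconds of backoff plus up to one second of random jitter.
                var delay = TimeSpan.FromSeconds(Math.Pow(2, attempt))
                            + TimeSpan.FromMilliseconds(Jitter.Next(0, 1000));
                await Task.Delay(delay);
            }
        }
    }
}
```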

Azure Functions - Can't determine project language from files.

Thanks to the Azure Functions Core Tools you can develop and test your functions on your local computer. However, when I tried to run my function locally (through func start), it failed with the following error message: Can't determine project language from files. Please use one of [--csharp, --javascript, --typescript, --java, --python, --powershell, --custom] The problem was that I didn't have a local.settings.json file available locally. This file is required to let the local runtime detect the project language used for my function. To fix it you can either run func init or create a local.settings.json file manually and add it to the project directory. Make sure that you have set the FUNCTIONS_WORKER_RUNTIME setting to the correct language. More information Develop Azure Functions locally using Core Tools | Microsoft Learn
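
The excerpt cuts off before the file contents; a minimal local.settings.json for a .NET isolated worker function (the runtime value depends on your project language, and the development-storage connection string is an assumption) looks like this:

```json
{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated",
    "AzureWebJobsStorage": "UseDevelopmentStorage=true"
  }
}
```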

Cloudbrew 2023–Azure Static Web Apps

Yesterday I had the honor to speak at CloudBrew, a two-day conference organized by AZUG, the Belgian Microsoft Azure User Group. Last year I talked about Azure Application Insights, this year I did a presentation about Azure Static Web Apps. Again it was a really fun experience with great sessions, nice people, and the Gouden Carolus Triple at the end made it perfect. I'm already looking forward to next year… Azure Static Web Apps – Too good to be true? As a lot of people are still unaware of Azure Static Web Apps (SWA) and the great things it has to offer, I decided to change this and brought a presentation about it. Discover the power of Azure Static Web Apps in this session, where we'll explore how it simplifies web development. Learn how to seamlessly integrate static front-end frameworks with serverless back-end APIs powered by Azure Functions. Explore features like effortless deployment, serverless back-end integration, cust

PowerBI–Access to the resource is forbidden

When trying to connect to an Azure Data Lake Storage Gen 2 using PowerBI, it failed with the following error message: "Access to the resource is forbidden". The first thing I tried was resetting my credentials (just in case). Therefore I went to File –> Options and settings –> Data source settings. There I clicked on Clear permissions. Unfortunately that didn't help. Although I could access the data lake data directly in the Azure Portal, it turned out that I didn't have enough rights to access the resource through PowerBI. To fix it I had to go to the Azure Portal and assign one of the following roles: Storage Blob Data Reader, Storage Blob Data Contributor, or Storage Blob Data Owner.
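
As an alternative to clicking through the portal, the role assignment can also be done from the Azure CLI; a sketch with placeholder values for the user and the storage account scope:

```bash
# Assign the Storage Blob Data Reader role on the storage account to a user (placeholder values).
az role assignment create \
  --assignee "user@example.com" \
  --role "Storage Blob Data Reader" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
```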

PowerBI–Load a parquet file from an Azure Data Lake Storage

In our Azure Data Lake Storage we have data stored in parquet files. Reading this data in PowerBI is not that hard. In this post I'll walk you through the steps to get this done. Start by opening PowerBI and click on Get data from another source. Choose Azure Data Lake Storage Gen2 from the list of available sources and click on Connect. Enter the URL of your Azure Data Lake Storage and click on OK. Now you get a list of available files found in the data lake. We don't want to use these files directly but transform them, so click on Transform Data. This will open up the Power Query editor. Click on Binary next to the parquet file we want to extract. This will add an extra step to our Power Query that parses the parquet file and extracts all the data. Click on Close & Apply to apply the changes to our query and start using the results. That's it! More information Azure Data Lake Storage Gen2 - Power Query | Microsoft Learn Analyze data in Az
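
The post walks through the Power Query UI; purely as an illustration, the M query that these steps roughly boil down to could look like this, assuming the AzureStorage.DataLake and Parquet.Document functions and placeholder account, container and file names:

```
let
    // Placeholder storage account and container URL.
    Source = AzureStorage.DataLake("https://<account>.dfs.core.windows.net/<container>"),
    // Pick the parquet file we are interested in and take its binary content.
    ParquetBinary = Table.SelectRows(Source, each [Name] = "data.parquet"){0}[Content],
    // Parse the parquet file into a table.
    Data = Parquet.Document(ParquetBinary)
in
    Data
```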

NuGet 6.8–Package vulnerability notifications

Starting from 6.8, NuGet will audit PackageReference packages and warn you if any have known vulnerabilities, similar to what npm does when using npm install. This works when using dotnet restore and also when using Visual Studio. Nice! More information Auditing package dependencies for security vulnerabilities | Microsoft Learn
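
The screenshots didn't make it into this excerpt; the audit behaviour can also be tuned per project with MSBuild properties, along these lines (property values shown are just examples):

```xml
<!-- Fragment of a .csproj; values are examples. -->
<PropertyGroup>
  <!-- Turn package auditing on or off for this project. -->
  <NuGetAudit>true</NuGetAudit>
  <!-- Only report vulnerabilities with at least this severity. -->
  <NuGetAuditLevel>low</NuGetAuditLevel>
</PropertyGroup>
```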

Git–Discard local changes

Git offers a wide range of possibilities to discard local changes. In this post I'd like to share some of the available options. Before I dive into the details, it is important to make the distinction between untracked and tracked files. From the documentation: Git has something called the "staging area" or "index". This is an intermediate area where commits can be formatted and reviewed before completing the commit. Untracked files live in the git working directory but are not managed by git until you stage them. Tracked files Here are some options to discard changes in tracked files: Discard Changes in a Specific File: If you want to discard changes in a specific file, you can use the following command: git checkout -- filename This will replace the changes in the specified file with the last committed version of that file. Discard Changes in All Modified Files: To discard changes in all modified files in the workin
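
Beyond the single-file example above, the remaining options boil down to a handful of commands; git restore is the newer equivalent of git checkout for this purpose, and git clean handles untracked files:

```bash
# Discard changes in a specific tracked file (newer syntax).
git restore path/to/file

# Discard changes in all modified tracked files in the working directory.
git checkout -- .
# or, equivalently:
git restore .

# Remove untracked files (add -d for untracked directories); use -n first to preview what would be deleted.
git clean -f
git clean -fd
```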

.NET 8–JSON Source Generator improvements

If you don't know what the JSON source generator is, I suggest first checking this older post before you continue reading. Still there? OK! Today I want to focus on some improvements in the JSON source generator that were introduced with the release of .NET 8. Although the JSON source generator is a great (performance) improvement, it couldn't handle some of the more recent language features like required members and init-only properties. If you tried to use these features in combination with source generators, you got the following warning before .NET 8: Starting from .NET 8, full support for required and init members has been added and the code above will just work:
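
The before/after code didn't survive in this excerpt; a minimal sketch of a source-generated context combined with required and init-only members, which compiles cleanly on .NET 8 (type names are illustrative):

```csharp
using System.Text.Json;
using System.Text.Json.Serialization;

// DTO using required and init-only members.
public class Person
{
    public required string FirstName { get; init; }
    public required string LastName { get; init; }
}

// JSON source generator context for the Person type.
[JsonSerializable(typeof(Person))]
internal partial class AppJsonContext : JsonSerializerContext
{
}

public static class Example
{
    public static string Serialize()
    {
        var person = new Person { FirstName = "John", LastName = "Doe" };
        // Uses the generated metadata instead of runtime reflection.
        return JsonSerializer.Serialize(person, AppJsonContext.Default.Person);
    }
}
```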

MassTransit–Quorum queues

Mirrored queues have been a feature in RabbitMQ for quite some time. When using mirrored queues, messages are replicated across multiple nodes, providing high availability in a RabbitMQ cluster. Each mirrored queue has a master and one or more mirrors, and messages are replicated from the master to its mirrors. Mirrored queues operate on synchronous replication, meaning that the master node waits for at least one mirror to acknowledge the receipt of a message before considering it successfully delivered. This impacts performance and can result in throughput issues due to the synchronous nature of replication. Certain failure scenarios can result in mirrored queues confirming messages too early, potentially leading to data loss. Quorum queues Quorum queues are a more recent addition to RabbitMQ, introduced to address some of the limitations posed by mirrored queues. They use a different replication model based on the Raft consensus algorithm. In this model, each queue is repli
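
The excerpt ends before the configuration part; in MassTransit, opting a RabbitMQ receive endpoint into a quorum queue is a single call on the endpoint configurator. A minimal sketch, where the endpoint name, message type and consumer are assumptions:

```csharp
using System;
using System.Threading.Tasks;
using MassTransit;

public record OrderSubmitted(Guid OrderId);

public class OrderSubmittedConsumer : IConsumer<OrderSubmitted>
{
    public Task Consume(ConsumeContext<OrderSubmitted> context) => Task.CompletedTask;
}

public static class BusConfiguration
{
    public static IBusControl Configure() =>
        Bus.Factory.CreateUsingRabbitMq(cfg =>
        {
            cfg.Host("rabbitmq://localhost");

            cfg.ReceiveEndpoint("order-submitted", e =>
            {
                // Declare the queue as a quorum queue instead of a classic (mirrored) queue.
                e.SetQuorumQueue();

                e.Consumer<OrderSubmittedConsumer>();
            });
        });
}
```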

.NET 8 and C# 12–Overview

If you want to see how .NET and C# evolved over time, have a look at the updated overview created by nietras. Check out his post for more details and a PDF version of the image above.

Company vs company

In the English vocabulary, the word 'company' has two meanings: Company: an organization that sells goods or services in order to make money And Company: the fact of being with a person or people, or the person or people you are with I think it is no coincidence that the same word is used for both meanings. Origin The word "company" traces its origins back to the Latin terms "com-", meaning "together with", and "panis", meaning "bread". In its earliest usage, "company" referred to a group of people who shared meals together, highlighting the communal aspect of coming together around a common table. This initial meaning laid the foundation for the word's dual evolution, branching into both social and business contexts. Social Company: In its more informal sense, "company" refers to a gathering of individuals for social interaction or mutual enjoyment. The shared origin with

ADFS - MSIS5007: The caller authorization failed for caller identity

Our ASP.NET Core applications typically use WS-Federation with ADFS as our Identity Provider. After configuring a new application (Relying Party) in ADFS, the first authentication attempt failed with the following error message: Encountered error during federation passive request. A look at the event viewer gave us more details: Protocol Name: wsfed Relying Party: https://localhost/example/ Exception details: Microsoft.IdentityServer.RequestFailedException: MSIS7012: An error occurred while processing the request. Contact your administrator for details. ---> Microsoft.IdentityServer.Service.IssuancePipeline.CallerAuthorizationException: MSIS5007: The caller authorization failed for caller identity <domain>\<ADUser> for relying party trust https://localhost/example/. at Microsoft.IdentityModel.Threading.AsyncResult.End(IAsyncResult result) at Microsoft.IdentityModel.Threading.TypedAsyncResult`1.End(IAsyncResult result) at Microsoft.Id

Find a subset from a set of values whose sum is closest to a specific value–C#

I got an interesting question from my girlfriend last week: Given I have a list of numbers, I want to select a subset of numbers whose sum is closest to a specific (positive) value. Let me give a simplified example to explain what she was asking for: If our list is [12, 79, 99, 91, 81, 47] and the expected value is 150, it should return [12, 91, 47] as 12+91+47 is 150. If our list is [15, 79, 99, 6, 69, 82, 32] and the expected value is 150, it should return [69, 82] as 69+82 is 151, and there is no subset whose sum is exactly 150. This turns out to be known as the Subset sum problem and is a computationally hard problem to solve. Luckily the list of numbers she needs to work with is quite small (about 50 numbers) and we can easily brute force this. Yesterday I explained how this problem can be solved in Excel, but what is the fun in that?! Let us have a look at how we can do this in C#. With some help from GitHub Copilot I came up with the following solution: Let us
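
The Copilot-assisted solution itself didn't make it into this excerpt; a straightforward brute-force sketch that enumerates every subset with a bitmask and keeps the one whose sum lands closest to the target (exponential in the list size, so only practical for modest lists):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class SubsetSum
{
    public static List<int> FindClosestSubset(IReadOnlyList<int> numbers, int target)
    {
        var bestSubset = new List<int>();
        var bestDistance = int.MaxValue;

        // Every bit in the mask decides whether the number at that index is part of the subset.
        for (long mask = 1; mask < (1L << numbers.Count); mask++)
        {
            var sum = 0;
            for (var i = 0; i < numbers.Count; i++)
            {
                if ((mask & (1L << i)) != 0)
                {
                    sum += numbers[i];
                }
            }

            var distance = Math.Abs(target - sum);
            if (distance < bestDistance)
            {
                bestDistance = distance;
                bestSubset = Enumerable.Range(0, numbers.Count)
                    .Where(i => (mask & (1L << i)) != 0)
                    .Select(i => numbers[i])
                    .ToList();
            }
        }

        return bestSubset;
    }
}

// Example from the post:
// SubsetSum.FindClosestSubset(new[] { 12, 79, 99, 91, 81, 47 }, 150) returns [12, 91, 47].
```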

Find a subset from a set of values whose sum is closest to a specific value– Excel

I got an interesting question from my girlfriend last week: Given I have a list of numbers, I want to select a subset of numbers whose sum is closest to a specific (positive) value. Let me give a simplified example to explain what she was asking for: If our list is [12, 79, 99, 91, 81, 47] and the expected value is 150, it should return [12, 91, 47] as 12+91+47 is 150. If our list is [15, 79, 99, 6, 69, 82, 32] and the expected value is 150, it should return [69, 82] as 69+82 is 151, and there is no subset whose sum is exactly 150. This turns out to be known as the Subset sum problem and is a computationally hard problem to solve. Luckily the list of numbers she needs to work with is quite small (about 50 numbers) and we can easily brute force this. Today I want to show you how we can tackle this problem in Excel using the Solver add-in. Activate the Solver Add-In: Go to the "File" tab. Click on "Options." In the Excel Opti