Posts

Showing posts from June, 2021

Avoiding index fragmentation with sequential GUIDs

Using a non-sequential GUID in an index is not a good idea, as it leads to index fragmentation and decreased performance. We could switch to an identity field, but that is not ideal in a highly distributed (micro)services architecture. RT.Comb to the rescue! RT.Comb implements the “COMB” technique, as described by Jimmy Nilsson, which replaces the portion of a GUID that is sorted first with a date/time value. This guarantees (within the precision of the system clock) that values will be sequential, even when the code runs on different machines. RT.Comb is available as a NuGet package and provides different strategies for generating the timestamp, each optimized for a specific database platform:

- RT.Comb.Provider.Legacy: the original technique. Only recommended if you need to support existing COMB values created using this technique.
- RT.Comb.Provider.Sql: the recommended technique for COMBs stored in Microsoft SQL Server.
- RT.Comb.Provider.Postgre: the recommended technique for COMBs stored in PostgreSQL.
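
Generating a COMB value is then a one-liner. A minimal sketch using the static SQL Server provider from the list above (a separate provider exists for PostgreSQL):

using System;
using RT.Comb;

// Create a sequential, timestamp-based GUID optimized for SQL Server indexes
Guid combGuid = Provider.Sql.Create();
Console.WriteLine(combGuid);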

MassTransit–Use record types for your message contracts

So far I have always used an interface for my message contracts. In combination with anonymous types, you don’t even need to implement this interface yourself; you can create an anonymous type on the fly that will be published. Let’s see if we can do the same thing by using a record type instead. Here is our rewritten message contract. We could publish the message in the same way as before, BUT you can also take advantage of the “target-typing” feature in C# 9, which allows us to create a specific instance of the record type instead of an anonymous type. Did you notice the difference? We call the constructor of the record type through the 'new()' syntax.
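
To make this concrete, here is a minimal sketch (the contract and field names are illustrative, not from the original post), assuming MassTransit's IPublishEndpoint:

using System;
using System.Threading.Tasks;
using MassTransit;

// A record type as message contract (illustrative name)
public record OrderSubmitted(Guid OrderId, DateTime SubmittedAt);

public class OrderService
{
    private readonly IPublishEndpoint _publishEndpoint;

    public OrderService(IPublishEndpoint publishEndpoint)
        => _publishEndpoint = publishEndpoint;

    public Task SubmitOrder(Guid orderId)
        // C# 9 target-typing: 'new(...)' creates an OrderSubmitted instance,
        // not an anonymous type
        => _publishEndpoint.Publish<OrderSubmitted>(new(orderId, DateTime.UtcNow));
}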

Getting started with React.js

If you want to get started with React.js, there is a lot of information out there. I can recommend having a look at React book, your beginner guide to React. It’s a completely free ebook on React.js with all the basic knowledge you need to start building React applications.

Table of contents:
- Basics: JSX, Setup, First component, Props, State, Methods, Thinking in Components, Conditional
- Styling: Styled components
- Images: Importing images
- Routing: Routing, Router and query parameters, Programmatic navigation, Lazy loading
- Advanced: Context API, Hooks, render-props
- Testing: Jest, nock, react-testing-library
- Redux: Redux basics, Actions, Reducers, Store, Adding Redux to React, Sagas, side effects

Azure Functions with built-in OpenAPI support

With the latest Visual Studio update, a new Azure Functions template was added: “Http Trigger with OpenAPI”. This template will create a new function with the necessary implementation for OpenAPI support. Let’s try this:

- Open Visual Studio and choose ‘Create a new Project’.
- Search for the ‘Azure Functions’ template and click Next.
- Specify a name and location for the project and click Create.
- Now you can choose a specific Azure Functions template. Select the ‘Http Trigger with OpenAPI’ template and click Create.

The new function is bootstrapped with the necessary implementation for OpenAPI support: extra attributes are added on top of your function method. When you run the function app, you can browse to ‘/api/swagger/ui’ to view the Swagger UI page.
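
The scaffolded function looks roughly like this. This is a sketch based on the template; exact attribute arguments and package namespaces may differ between versions of the OpenAPI extension:

using System.Net;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Extensions.OpenApi.Core.Attributes;
using Microsoft.OpenApi.Models;

public static class Function1
{
    [FunctionName("Function1")]
    [OpenApiOperation(operationId: "Run", tags: new[] { "name" })]
    [OpenApiParameter(name: "name", In = ParameterLocation.Query, Required = true, Type = typeof(string))]
    [OpenApiResponseWithBody(statusCode: HttpStatusCode.OK, contentType: "text/plain", bodyType: typeof(string))]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req)
    {
        // Read the 'name' query parameter documented by the OpenApiParameter attribute
        string name = req.Query["name"];
        return new OkObjectResult($"Hello, {name}");
    }
}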

WebDeploy - EnableMsDeployAppOffline

At one of my customers we have been using WebDeploy for years to deploy our web applications to IIS. Although not that well documented, it works great and has good Visual Studio integration. When reviewing the release pipeline of one of my colleagues, I noticed that they introduced tasks in the release pipeline to stop the application pool before deploying the package and start the application pool again once the deployment completed. This is probably done because ASP.NET Core applications hosted in IIS run in-place and lock the files that they are running from. So if we don’t stop the application pool, WebDeploy will fail as it cannot replace these files. Although this solution works, WebDeploy has a built-in alternative: the EnableMsDeployAppOffline flag. When this flag is set to true, WebDeploy will create an app_offline.htm file (which unloads the running application), publish the files, and then remove that file again. Sidenote: About app_offline.htm…
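
The flag can be set in your .pubxml publish profile, or passed straight to MSBuild when publishing from the command line. A minimal sketch (project and profile names are illustrative):

msbuild MyWebApp.csproj /p:DeployOnBuild=true /p:PublishProfile=Production /p:EnableMsDeployAppOffline=true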

Azure Pipelines - A template expression is not allowed in this context

In my attempt to further optimize our deployment process, I tried to move most of the pipeline logic to a YAML template. This is how the template looked (I simplified it a little bit). However, when I tried to create a pipeline using this template I got the following error message:

/cd-template.yml (Line: 14, Col: 14): A template expression is not allowed in this context

The problem is that I’m trying to use a template expression inside a pipeline resource. At first I thought I had done something wrong, but then I found the following announcement in the release notes:

Previously, compile-time expressions ( ${{ }} ) were not allowed in the resources section of an Azure Pipelines YAML file. With this release, we have lifted this restriction for containers. This allows you to use runtime parameter contents inside your resources, for example to pick a container at queue time. We plan to extend this support to other resources over time.

As stated above, compile-time expressions are so far only supported for container resources, not for pipeline resources.
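
To illustrate the problematic pattern, here is a minimal sketch (parameter and pipeline names are illustrative): a compile-time expression inside a pipeline resource triggers exactly this error:

parameters:
  - name: sourcePipeline
    type: string

resources:
  pipelines:
    - pipeline: ci
      source: ${{ parameters.sourcePipeline }}   # not allowed here (yet)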

NuGet Package Explorer–Compiler Flags

After creating a NuGet package as part of my build pipeline, I opened the package in NuGet Package Explorer to double-check that everything was OK. Unfortunately I got the following warning in the Compiler Flags section:

Present, not reproducible

When I hovered over the warning icon, I got the following extra information:

Ensure you’re using at least the 5.0.300 SDK or MSBuild 16.10

The reason I got this error is that on the build server an older .NET 5 SDK was installed and used to build this package. I could easily verify this by calling the following command:

dotnet --list-sdks

The output should show something like this:

5.0.204 [C:\Program Files\dotnet\sdk]

As you can see, I didn’t have the 5.0.300 SDK installed. Fixing it can be done by either updating Visual Studio to the latest version or installing the latest .NET SDK.

Visual Studio 2019–Manage Docker Compose launch settings

With the latest Visual Studio update, the Docker Compose tooling got improved and it is now possible to create a launch profile to run any combination of services defined in your Compose files. Before, you only had one launch profile and you couldn’t choose which services to start. Let’s find out how to use this new feature:

- Right-click on your docker-compose project and select Manage Docker Compose Launch Settings. The tooling will scan through all Compose files to find all services.
- Once the scanning is completed, you can choose which services to launch.
- Create a new profile by clicking New… and specify a name.
- Now we can configure which services to launch for this profile. Also choose the Launch Service name and action.
- Once you are done, click OK. A new launch profile is created and available to use.

Visual Studio 2019 - Create a Docker compose file for an existing solution

Visual Studio makes it really easy to create a Docker Compose file for an existing solution. Here are the steps to get there:

- Open your solution in Visual Studio.
- Right-click on one of your projects and choose Add –> Container Orchestrator Support.
- Choose Docker Compose in the Add Container Orchestrator dialog and click OK.
- Choose Linux as the Target OS and click OK.
- A new Docker Compose project is generated. Inside this project you find a docker-compose.yml file with a reference to the project you’ve selected.
- To add the other projects, follow the same procedure: right-click on another project and choose Add –> Container Orchestrator Support again. The docker-compose.yml file will be updated with the new project information.
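
The generated file looks roughly like this. This is a sketch; the service name and Dockerfile path depend on your project:

version: '3.4'

services:
  webapp:
    image: ${DOCKER_REGISTRY-}webapp
    build:
      context: .
      dockerfile: WebApp/Dockerfile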

Azure Pipelines - Use pipeline variables

I’m currently migrating an existing CI/CD pipeline built in Azure DevOps from the ‘classic’ build approach to YAML templates. In our old setup we had 2 steps:

- A CI pipeline that builds our application, runs all tests and packages the application in a Docker container. This container is then published to Azure Container Registry.
- A release pipeline that is triggered once the CI pipeline completes. The CI pipeline is available as an artifact. We use the CI pipeline ‘branchname’ together with the ‘buildid’ to find the correct image inside ACR and deploy it:

example.azurecr.io/cms-spa:$(Release.Artifacts._CMS - CI.SourceBranchName).$(Release.Artifacts._CMS - CI.BuildId)

To achieve the same thing through Azure Pipelines and YAML templates, we first need to define the CI build as a Pipeline Resource, which I explained in this post: https://bartwullems.blogspot.com/2021/06/azure-pipelinespipeline-resource-trigger.html . Once the pipeline resource is set, we can also reference the resource’s predefined variables, as sketched below.
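
A minimal sketch (the alias is illustrative); Azure Pipelines exposes predefined variables on a pipeline resource, such as resources.pipeline.<alias>.sourceBranch and resources.pipeline.<alias>.runID:

resources:
  pipelines:
    - pipeline: cmsCI            # alias used in the variable names below
      source: 'CMS - CI'         # name of the CI pipeline

steps:
  - script: |
      echo "Branch: $(resources.pipeline.cmsCI.sourceBranch)"
      echo "Run: $(resources.pipeline.cmsCI.runID)"
    displayName: Show pipeline resource variables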

Azure DevOps - Branch policy path filters

Last week I got a tip from a colleague (Thx Sam!). During a ‘Tech sync’ we were discussing how to avoid committing secrets to your source repository. Of course there exist tools that scan for credentials inside your repository, but these tools have to be configured and are not perfect. Another way to do this is by introducing a Reviewer policy together with a path filter in Azure DevOps.

By setting a path filter, the branch policy is only applied when files matching the filter are changed. Typical places where application secrets are added are config files, application settings, … Let’s define some paths to check:

/Config/*
*.json
*.config

To combine multiple paths you can use ; as a separator: /Config/*;*.json;*.config

To apply this configuration for a repository, go to the cross-repository settings (e.g. https://dev.azure.com/<organization name>/_settings/repositories ). Go to the ‘Automatically include code reviewers’ …

Visual Studio 2019–Editorconfig UI

As I read more code than I write, code consistency and readability are really important to me. That is why I like .editorconfig, and one of the reasons why I blogged about it before:

- Introducing .editorconfig: https://bartwullems.blogspot.com/2017/04/visual-studio-2017editorconfig.html
- Generate an .editorconfig: https://bartwullems.blogspot.com/2019/10/visual-studiogenerate-editorconfig-file.html
- The ‘dotnet format’ command: https://bartwullems.blogspot.com/2020/04/dotnet-format.html
- Let private fields start with an underscore: https://bartwullems.blogspot.com/2019/12/editorconfig-let-private-fields-start.html
- Code cleanup in Visual Studio: https://bartwullems.blogspot.com/2019/10/visual-studio-2019code-cleanup.html

Although I really like EditorConfig files, configuring them is not that easy. If you agree then I have great news for you! Starting from Visual Studio 16.10, an EditorConfig designer was added that allows you to easily view and configure your code style conventions.

FluentNhibernate–Use Microsoft.Data.SqlClient

Microsoft released version 3 of Microsoft.Data.SqlClient. This .NET data provider for SQL Server provides general connectivity to the database and supports all the latest SQL Server features for applications targeting .NET Framework, .NET Core, and .NET Standard. It can be used as a replacement for the built-in System.Data.SqlClient, which will still be available for a long time but doesn’t support newer SQL Server features. I thought the 3.0 release was a good reason to make the switch in my .NET Core applications. As I’m using NHibernate (together with FluentNHibernate), I had to do some configuration work to get this working.

Remark: If you are using EF Core, there is nothing you need to do if you are using version 3.0 or higher. From that version on, the Microsoft SQL Server EF Core provider uses Microsoft.Data.SqlClient by default.

The steps to get it working with (Fluent)NHibernate are short and easy:

Step 1: Add a reference to Microsoft.Data.SqlClient.
Step 2: …
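
The gist of the configuration is telling NHibernate to use the Microsoft.Data.SqlClient driver. A minimal sketch with FluentNHibernate (connection string illustrative; assumes NHibernate 5.3+, which ships a MicrosoftDataSqlClientDriver):

using FluentNHibernate.Cfg;
using FluentNHibernate.Cfg.Db;
using NHibernate;
using NHibernate.Driver;

// Configure FluentNHibernate to use Microsoft.Data.SqlClient
// instead of the default System.Data.SqlClient driver
ISessionFactory sessionFactory = Fluently.Configure()
    .Database(MsSqlConfiguration.MsSql2012
        .ConnectionString("Server=.;Database=MyDb;Integrated Security=true")
        .Driver<MicrosoftDataSqlClientDriver>())
    .BuildSessionFactory();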

Domain Driven Design–The first 15 years

If you are into DDD and you want a heads-up on what has happened in the DDD community since the release of the “blue book” by Eric Evans, Domain Driven Design – The first 15 years is a must-read.

Fifteen years after the publication of "Domain-Driven Design: Tackling Complexity in the Heart of Software" by Eric Evans, DDD is gaining more adoption than ever. To celebrate the anniversary, we've asked prominent authors in the software design world to contribute old and new essays.

The book was released in 2019, so maybe it is time for another update, but until that happens you can read this edition.

Visual Studio–Create your CI/CD pipeline using Github Actions

When looking through the preview features activated inside Visual Studio (Options > Environment > Preview Features) I noticed the ‘GitHub Actions support in Publish’ feature. Let’s try it to see what it does…

- Open the Start Window in Visual Studio, click on the Clone a Project button, enter the repository location and local path, and click on Clone.
- Go to the Solution Explorer, right-click on the Solution and choose Publish from the context menu.
- Let’s publish our application to Azure, so choose Azure from the list of available Targets and click on Next.
- Now we need to choose a specific target. Let’s choose Azure App Service and click on Next.
- On the next screen, we need to choose the App Service instance we want to use. After doing that, click on Next.
- As a last step, we need to select the deployment type. It is here that the preview feature appears, as we can choose CI/CD using GitHub Actions. Click on Finish.
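
Visual Studio then adds a GitHub Actions workflow to the repository. A rough sketch of the general shape of such a workflow (names, versions, and secrets are illustrative, not taken from the actual generated file):

name: Build and deploy to Azure Web App

on:
  push:
    branches: [ main ]

jobs:
  build-and-deploy:
    runs-on: windows-latest
    steps:
      - uses: actions/checkout@v2
      - name: Setup .NET
        uses: actions/setup-dotnet@v1
        with:
          dotnet-version: 5.0.x
      - name: Publish
        run: dotnet publish -c Release -o ./publish
      - name: Deploy to Azure Web App
        uses: azure/webapps-deploy@v2
        with:
          app-name: my-app-service          # illustrative
          publish-profile: ${{ secrets.AZURE_WEBAPP_PUBLISH_PROFILE }}
          package: ./publish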

Azure Pipelines - Pass location of baked manifest files

Inside our CD pipeline, I first want to bake a manifest file through Kustomize (more about Kustomize in a later blog post) and then use the baked manifest to deploy to our AKS cluster. Both actions are possible through the Kubernetes Manifest task, but I didn’t immediately find how I could pass the location of the baked manifest bundle from the bake task to the deployment task. The trick is to give the first task a specific name and point to the ‘manifestsBundle’ output variable through the task name. In the example below I named the bake task ‘bake’, so I can use ‘$(bake.manifestsBundle)’ in the second task:
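
A sketch of what the updated pipeline could look like (paths and connection settings are illustrative):

steps:
  - task: KubernetesManifest@0
    name: bake                      # the name makes the output variable addressable
    displayName: Bake manifests with Kustomize
    inputs:
      action: bake
      renderType: kustomize
      kustomizationPath: ./kustomize

  - task: KubernetesManifest@0
    displayName: Deploy baked manifests
    inputs:
      action: deploy
      manifests: $(bake.manifestsBundle)
      # kubernetesServiceConnection and namespace omitted for brevity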

Azure Pipelines - Error deploying manifest to Kubernetes cluster

I was trying to deploy a manifest through the Kubernetes manifest task, but the task failed with the following error message:

error: error validating "/home/vsts/work/_temp/Ingress_tags-api-ingress_1622817055216": error validating data: [ValidationError(Ingress.spec.rules[0].http.paths[0].backend): unknown field "serviceName" in io.k8s.api.networking.v1.IngressBackend, ValidationError(Ingress.spec.rules[0].http.paths[0].backend): unknown field "servicePort" in io.k8s.api.networking.v1.IngressBackend]; if you choose to ignore these errors, turn validation off with --validate=false

Before, I was using the Kubectl task, which didn’t complain at all!? The Kubernetes manifest task validates the manifest before it deploys it (which is in fact a good thing). So let’s have a look at what is wrong with the manifest I try to deploy. The problem is that I’m still using the beta syntax for the ingress…
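
The fix, in a nutshell: in networking.k8s.io/v1 the backend fields changed from serviceName/servicePort to a nested service object. A before/after sketch (the service name and port are illustrative, based on the resource name in the error message):

# extensions/v1beta1 (old) backend syntax:
backend:
  serviceName: tags-api
  servicePort: 80

# networking.k8s.io/v1 (new) backend syntax:
# (note: v1 also requires a pathType on each path)
backend:
  service:
    name: tags-api
    port:
      number: 80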

Azure Pipelines–Artifacts

In Azure Pipelines you have the concept of artifacts. Artifacts can be things like compiled code coming from a CI build, a Docker container, another source repository, and so on… These artifacts can be used inside your release pipeline to deploy them to one or more environments. When switching to YAML pipelines, I couldn’t find the concept of artifacts inside the schema definition. It turns out that inside the YAML template, an artifact is defined by a ‘Resource’. I’ll show you 2 examples to explain how to use them, sketched after this list.

Use the output of another pipeline as a Pipeline Resource. The most important settings are:
- The source name: this should match the name of the pipeline that creates the artifact.
- Trigger: this defines if this pipeline should be triggered when the pipeline you are pointing to completes.

Use another repository as a Repository Resource. The most important setting is:
- Repository: this should match the name of the repository you want to use.
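
A combined sketch of both resource types (all names are illustrative):

resources:
  pipelines:
    - pipeline: ciBuild          # alias to reference this resource later
      source: 'My-CI-Pipeline'   # name of the pipeline that creates the artifact
      trigger: true              # run this pipeline when the CI pipeline completes
  repositories:
    - repository: templates      # alias
      type: git                  # an Azure Repos Git repository
      name: MyProject/Pipeline-Templates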

GraphQL– Optional arguments

Every field on a GraphQL object type can have zero or more arguments, for example a unit argument on a length field. An argument can be either required or optional, and for optional arguments you can specify a default value, like METER. To configure this using HotChocolate, you can use the following syntax:
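
A sketch of both pieces (type and member names are illustrative). First the schema shape, then a HotChocolate code-first equivalent:

type Starship {
  length(unit: LengthUnit = METER): Float
}

// HotChocolate code-first configuration:
using HotChocolate.Types;

public record Starship(double LengthInMeters);
public enum LengthUnit { Meter, Foot }

public class StarshipType : ObjectType<Starship>
{
    protected override void Configure(IObjectTypeDescriptor<Starship> descriptor)
    {
        descriptor.Field("length")
            // optional argument with a default value of METER
            .Argument("unit", a => a.Type<EnumType<LengthUnit>>().DefaultValue(LengthUnit.Meter))
            .Type<FloatType>()
            .Resolve(ctx =>
            {
                var unit = ctx.ArgumentValue<LengthUnit>("unit");
                var starship = ctx.Parent<Starship>();
                return unit == LengthUnit.Foot
                    ? starship.LengthInMeters * 3.28
                    : starship.LengthInMeters;
            });
    }
}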

ASP.NET Core - Handle cancelled requests

When an HTTP request is made to your ASP.NET Core application, it is always possible that the request is aborted, for instance when the user closes their browser without waiting for the response. In this case, you may want to stop all the work to avoid consuming resources. Here are 2 possible approaches to handle this scenario. A first option is to check the HttpContext.RequestAborted property. A second option is to let the model binder do its work and add a parameter of type CancellationToken to the action:
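
A minimal sketch of both options (the controller and the simulated work are illustrative):

using System.Threading;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;

public class ReportsController : ControllerBase
{
    // Option 1: check HttpContext.RequestAborted manually
    [HttpGet("reports/manual")]
    public async Task<IActionResult> GetManual()
    {
        for (var i = 0; i < 10; i++)
        {
            if (HttpContext.RequestAborted.IsCancellationRequested)
                return new EmptyResult(); // client is gone, stop working
            await Task.Delay(500);        // simulated work
        }
        return Ok("done");
    }

    // Option 2: let the model binder inject a CancellationToken
    [HttpGet("reports/bound")]
    public async Task<IActionResult> GetBound(CancellationToken cancellationToken)
    {
        // same token as HttpContext.RequestAborted; throws when the client disconnects
        await Task.Delay(5000, cancellationToken);
        return Ok("done");
    }
}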

Azure Kubernetes Service - Time sync issue between nodes

We encountered a strange issue this week inside our AKS cluster: we discovered that the time was not synced between the different pods and nodes. We noticed this because we couldn’t use our OAuth security tokens, as the IssuedAt timing was off. To validate this issue we ssh’d into the nodes and ran the following command:

$: sudo timedatectl status

This resulted in the following output:

Local time: Wed 2021-6-2 13:48:44 UTC
Universal time: Wed 2021-6-2 13:48:44 UTC
RTC time: Wed 2021-6-2 13:48:44
Time zone: Etc/UTC (UTC, +0000)
Network time on: yes
NTP synchronized: no
RTC in local TZ: no

The NTP service was disabled and no NTP server was configured. To fix it we opened timesyncd.conf:

$: sudo cat /etc/systemd/timesyncd.conf

and changed the NTP value:

[Time]
NTP=ntp.ubuntu.com

After that we restarted the timesync service:

$: sudo timedatectl set-ntp true
$: sudo systemctl restart systemd-timesyncd

Azure Pipelines–Pipeline Resource Trigger

Yesterday I started blogging about my journey moving from the ‘classic’ build approach to YAML templates. I shared how you can use a build completion trigger to link your YAML build to a previously completed build. Although this approach works, it is no longer recommended. A better way is to use ‘Pipeline Resource Triggers’. This is done by defining a pipelines resource inside your YAML template; pipelines is a dedicated resource only for Azure Pipelines. Let’s have a look at the syntax in the sketch below. In your resource definition, pipeline is a unique value that you can use to reference the pipeline resource later on, and source is the name of the pipeline that produces an artifact. Remark: the source name is case sensitive. More information: https://docs.microsoft.com/en-us/azure/devops/pipelines/process/pipeline-triggers?view=azure-devops
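
A minimal sketch of the syntax (alias and pipeline names are illustrative):

resources:
  pipelines:
    - pipeline: ciBuild           # unique alias to reference the resource later on
      source: 'My-CI-Pipeline'    # name of the producing pipeline (case sensitive)
      trigger:
        branches:
          include:
            - main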