

Showing posts from May, 2022

Azure Application Insights–Set cloud role name in ASP.NET Core

One way to improve your Application Insights telemetry is to set the cloud role name yourself. I shared how to do this in an ASP.NET application in a previous post. In this post I'll show you how to use the same approach in an ASP.NET Core application. The first steps are quite similar. We still need to create a custom TelemetryInitializer: we create a class that implements the ITelemetryInitializer interface and its Initialize method: Now, in the Startup.ConfigureServices method, we register that telemetry initializer as a singleton: Or, if you are using .NET 6 minimal APIs:
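The code samples were stripped from this excerpt; the pieces described above can be sketched as follows (the initializer class name and the role name "MyWebApi" are placeholders):

```csharp
using Microsoft.ApplicationInsights.Channel;
using Microsoft.ApplicationInsights.Extensibility;

// Custom telemetry initializer that stamps every telemetry item
// with an explicit cloud role name ("MyWebApi" is a placeholder).
public class CloudRoleNameInitializer : ITelemetryInitializer
{
    public void Initialize(ITelemetry telemetry)
    {
        telemetry.Context.Cloud.RoleName = "MyWebApi";
    }
}
```

Registration then looks like this in Startup.ConfigureServices, or against builder.Services in a .NET 6 minimal API Program.cs:

```csharp
// Startup-based registration:
services.AddSingleton<ITelemetryInitializer, CloudRoleNameInitializer>();

// .NET 6 minimal API (Program.cs):
builder.Services.AddSingleton<ITelemetryInitializer, CloudRoleNameInitializer>();
```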

Azure Application Insights–Angular integration

To get the most out of Application Insights when building a Single Page Application, some specific tweaks are required. Let's walk through the steps to get Application Insights configured in your Angular application. I assume that you have already configured an Application Insights resource in the Azure Portal and have an existing Angular application. Step 1 – Add dependencies and configuration settings. We'll start by adding an extra dependency, ‘@microsoft/applicationinsights-web’, to our package.json: We also update our environment.ts file to store our Application Insights instrumentation key there: Step 2 – Create an Angular service to wrap the Application Insights SDK. We create an Angular service that we will use as a wrapper around the Application Insights SDK. Here we inject the AppInsights class. Notice that we set both ‘enableCorsCorrelation’ and ‘enableAutoRouteTracking’ to true. This will correctly inject a correlation id in our request headers and will
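The excerpt cuts off before the service code; a minimal sketch of such a wrapper, assuming the key is stored under environment.appInsights.instrumentationKey (the service and property names are illustrative, the SDK options are the real ones named above):

```typescript
import { Injectable } from '@angular/core';
import { ApplicationInsights } from '@microsoft/applicationinsights-web';
import { environment } from '../environments/environment';

@Injectable({ providedIn: 'root' })
export class ApplicationInsightsService {
  private appInsights = new ApplicationInsights({
    config: {
      instrumentationKey: environment.appInsights.instrumentationKey,
      enableCorsCorrelation: true,   // inject a correlation id header into outgoing requests
      enableAutoRouteTracking: true, // log a page view on every route change
    },
  });

  constructor() {
    this.appInsights.loadAppInsights();
  }

  trackEvent(name: string): void {
    this.appInsights.trackEvent({ name });
  }
}
```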

Armchair Architects

Do you want to learn more about what it takes to be an architect? Or are you an architect right now and want to expand your knowledge? Check out Armchair Architects, a video series in which David Blank-Edelman talks with two Microsoft architects, Uli Homann and Eric Charran, about different architecture-related topics like resiliency, how the cloud is changing architecture, and so on. Subscribe here:
Remark: These videos are part of the Azure Enablement Show.

Azure Application Insights–Log SQL query

One of the nice features that Application Insights has to offer is the ability to automatically track dependencies. Among the list of tracked dependencies you find database calls (made through System.Data.SqlClient or Microsoft.Data.SqlClient). By default the name of the server and database are collected, together with the duration of the call, but no extra information like the database query is logged. To log the database query as well, you need to enable this explicitly in ASP.NET Core: After this code change, you'll find the SQL command in the Command section instead of the name of the server and database. Remark: For ASP.NET applications, the full SQL query text is collected with the help of byte code instrumentation, which requires the Microsoft.Data.SqlClient NuGet package instead of the System.Data.SqlClient library.
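The enabling code itself is missing from the excerpt; in the ASP.NET Core SDK this is done by configuring the dependency tracking module, roughly like so:

```csharp
using Microsoft.ApplicationInsights.DependencyCollector;
using Microsoft.Extensions.DependencyInjection;

public void ConfigureServices(IServiceCollection services)
{
    services.AddApplicationInsightsTelemetry();

    // Opt in to collecting the full SQL command text for tracked database calls.
    services.ConfigureTelemetryModule<DependencyTrackingTelemetryModule>(
        (module, options) => module.EnableSqlCommandTextInstrumentation = true);
}
```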

Azure Application Insights–Collect Performance counters data

One of the features you get out of the box when using Application Insights is the ability to collect performance counter data. This allows you to track statistics like CPU, memory and disk usage. In ASP.NET Core the following counters are collected by default:

- Process\% Processor Time
- Process\% Processor Time Normalized
- Memory\Available Bytes
- Process\Private Bytes
- Process\IO Data Bytes/sec
- Processor(_Total)\% Processor Time

When the performance counters are collected correctly, you can use them as one of the available metrics in Application Insights.

Troubleshooting performance counter collection

But what if no data is returned? The first thing you should check is whether you have a related log message. To do so, run the following KQL query inside the Logs section:

traces
| where message startswith "AI: Error collecting"
| limit 10

In my case it returned the following error message:

AI: Error collecting 6 of the configured perf
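If a counter you need is not in that default set, the collector module can be extended; a sketch, where the counter path and the "reportAs" name are examples:

```csharp
using Microsoft.ApplicationInsights.Extensibility.PerfCounterCollector;
using Microsoft.Extensions.DependencyInjection;

public void ConfigureServices(IServiceCollection services)
{
    services.AddApplicationInsightsTelemetry();

    // Collect an extra performance counter on top of the defaults;
    // the counter path and display name below are illustrative.
    services.ConfigureTelemetryModule<PerformanceCollectorModule>(
        (module, options) => module.Counters.Add(
            new PerformanceCounterCollectionRequest(
                @"\Process(??APP_WIN32_PROC??)\Handle Count", "Process handle count")));
}
```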

C# 10–File scoped namespaces

Last week I had a pair programming session with a candidate. The candidate had a background in C and was new to C#. While he was writing some code, he struggled with namespaces in C#. It even got me confused, as I had completely forgotten that C# 10 introduced a new feature: 'file scoped namespaces'. I had always written namespaces in one of the following ways: The advantage of this approach is that you can have multiple namespaces in one file, but in reality I don't think I ever used this possibility. The disadvantage is that traditional namespaces add some extra ceremony to your code. C# 10 gets rid of this boilerplate when you only use one namespace per file. So the example above can be rewritten as: All the types we write in this file will be part of the X.Y.Z namespace.

Update your existing code

To update your existing code in one go, you can use the dotnet format tool:

dotnet tool update --global dotnet-format
dotnet format Example.sln --severity
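The stripped samples presumably looked something like this (the namespace and type names are illustrative). First, the traditional block-scoped form:

```csharp
// Traditional block-scoped namespace: the braces add a level of indentation,
// but allow multiple namespaces in one file.
namespace X.Y.Z
{
    public class Example
    {
    }
}
```

And the C# 10 file-scoped equivalent:

```csharp
// File-scoped namespace: one statement, no braces, no extra indentation.
// Every type declared below it in this file belongs to X.Y.Z.
namespace X.Y.Z;

public class Example
{
}
```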

Monitor RabbitMQ metrics in ElasticSearch

One of the reasons we are using ElasticSearch is to bring all the metrics from multiple systems together in one place. One of the systems we want to monitor is our RabbitMQ cluster. Let's walk through the steps to get our RabbitMQ metrics sent to ElasticSearch and available in Kibana.

The RabbitMQ Metrics integration

Open up Kibana and click on the hamburger icon in the left corner. This opens a side menu with a big ‘Add integrations’ button at the bottom. Hit this button to go to the Integrations overview page. Here you can see the list of all available integrations. Let's search for “rabbitmq”. We get 2 results back. Click on the ‘RabbitMQ Metrics’ integration to go to the installation instructions.

Install MetricBeat on Windows

The RabbitMQ Metrics integration is part of MetricBeat. MetricBeat is a lightweight shipper that you can install on your servers to periodically collect metrics from the operating system and from services running on those ser
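The excerpt stops before the configuration step; once Metricbeat is installed, enabling its RabbitMQ module typically boils down to a small YAML fragment like this (host, credentials and metricsets below are examples, assuming the RabbitMQ management plugin listens on its default port 15672):

```yaml
# modules.d/rabbitmq.yml (enable with: metricbeat modules enable rabbitmq)
- module: rabbitmq
  metricsets: ["node", "queue", "connection", "exchange"]
  period: 10s
  hosts: ["localhost:15672"]   # RabbitMQ management API endpoint
  username: guest              # example credentials
  password: guest
```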

ElasticSearch - setting [cluster.initial_master_nodes] is not allowed when [discovery.type] is set to [single-node]

Upgrading ElasticSearch can be quite challenging (at least for me it was). I talked about some other issues I had while upgrading before. Let's add a last one to the list... When talking about the bootstrap check failures in a previous post, I shared my final solution. But if you were paying attention as a reader, you may have noticed that I didn't follow the suggestions mentioned inside the bootstrap check error:

bootstrap check failure [1] of [2]: the default discovery settings are unsuitable for production use; at least one of [discovery.seed_hosts, discovery.seed_providers, cluster.initial_master_nodes] must be configured

I didn't set the discovery.seed_hosts, discovery.seed_providers or cluster.initial_master_nodes setting as suggested; instead I changed the discovery.type to single-node. In fact, my first attempt looked like this: However, this resulted in the following error message:

[2022-05-09T11:33:18,943][WARN ][stderr                   ] [SERVER1] java
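The failing first attempt is not shown in the excerpt, but as the post title suggests, it presumably combined both settings; with single-node discovery the initial master nodes list has to go (node name below is an example):

```yaml
# elasticsearch.yml - the failing combination (sketch):
#   discovery.type: single-node
#   cluster.initial_master_nodes: ["SERVER1"]
#
# The fix: with single-node discovery, drop cluster.initial_master_nodes entirely.
discovery.type: single-node
```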

ElasticSearch Bootstrap check failure

Upgrading ElasticSearch can be quite challenging (at least for me it was). I talked about some other issues I had while upgrading before. Let's add another one to the list... One of the ways ElasticSearch helps you do the right thing is by running some bootstrap checks. These bootstrap checks inspect a variety of Elasticsearch and system settings and compare them to values that are safe for the operation of Elasticsearch. If Elasticsearch is in development mode, any bootstrap checks that fail appear as warnings in the Elasticsearch log. If Elasticsearch is in production mode, any bootstrap checks that fail will cause Elasticsearch to refuse to start. In this case the bootstrap checks returned the following errors:

[2022-05-09T09:53:42,647][ERROR][o.e.b.Bootstrap          ] [SERVER1] node validation exception
[2] bootstrap checks failed. You must address the points described in the following [2] lines before starting Elasticsearch.
bootstrap check failure [1] of [2]:

ElasticSearch - Unrecognized VM option 'UseConcMarkSweepGC'

Upgrading ElasticSearch can be quite challenging (at least for me it was). I talked about some other issues I had while upgrading before. Let's add another one to the list... After installing ElasticSearch 8.1 I couldn't get it started. Here is the error I got instead:

12:31:34 » Exception in thread "main" java.lang.RuntimeException: starting java failed with [1]
12:31:34 » output:
12:31:34 »
12:31:34 » error:
12:31:34 » Unrecognized VM option 'UseConcMarkSweepGC'
12:31:34 » Error: Could not create the Java Virtual Machine.
12:31:34 » Error: A fatal exception has occurred. Program will exit.
12:31:34 » at
12:31:34 » at
12:31:34 » at
12:31:34 » at
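The excerpt ends before the resolution, but the error itself points at a garbage collector flag that newer JDKs no longer accept: the CMS collector was removed in JDK 14, and Elasticsearch 8.x ships a bundled JDK newer than that. So any jvm.options (or custom options file) carried over from an older install has to lose that line:

```
# jvm.options - delete or comment out the flag the bundled JVM no longer recognizes:
# -XX:+UseConcMarkSweepGC
```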

ElasticSearch - BindTransportException: Failed to bind to 9300-9400

After installing a new ElasticSearch cluster I updated the elasticsearch.yml to expose it outside the local virtual machine. This is what the updated elasticsearch.yml looked like: There is definitely something wrong with this configuration, as ElasticSearch refused to start. A look at the log file showed the following error information:

[2022-05-09T09:45:02,104][ERROR][o.e.b.Bootstrap] [SERVER1] Exception
org.elasticsearch.transport.BindTransportException: Failed to bind to [9300-9399]
at org.elasticsearch.transport.TcpTransport.bindToPort( ~[elasticsearch-8.2.0.jar:8.2.0]
at org.elasticsearch.transport.TcpTransport.bindServer( ~[elasticsearch-8.2.0.jar:8.2.0]
at org.elasticsearch.transport.netty4.Netty4Transport.doStart( ~[?:?]
at ~[?:?]
at org.elastic
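The yml file itself is missing from the excerpt. A BindTransportException on the transport port range usually means the configured bind address does not resolve to an address the machine actually owns, or the ports are already taken. For reference, a sketch of the settings involved when exposing a node beyond localhost (values are illustrative, not the post's actual configuration):

```yaml
# elasticsearch.yml (illustrative values)
network.host: 0.0.0.0      # must resolve to an address this server actually owns
http.port: 9200            # REST traffic
transport.port: 9300-9400  # node-to-node traffic; the range that failed to bind here
```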

Learn GraphQL in one week

If you are new to GraphQL (or if you want to extend your existing knowledge), I have a great tip for you: in this free course you will build a full-stack eCommerce application with GraphQL Yoga, Prisma, PlanetScale, Next.js, Tailwind CSS and Stripe Checkout. In one week, you get to learn all the basic GraphQL concepts like queries, mutations and fragments, how to create your own GraphQL backend, and how to consume your GraphQL backend using the Apollo GraphQL client. Check out this intro to see what you will learn during this week:

Publish a Static Web App on Azure

I had to create a small personal website. As a big fan of the Jamstack, I like to use a static site generator like Next.js, Gatsby or Hugo. Previously, I typically deployed the generated website to Azure Blob Storage as a really cheap way to host a static website. But with the introduction of Azure Static Web Apps, it is time to give them a try…

Create a Gatsby static website

There is a large list of available site generators, but as I like GraphQL a lot, I decided to use Gatsby. First install the Gatsby CLI:

npm install -g gatsby-cli

Now you can create a new website on your local machine using one of the available themes:

npx gatsby new example-app

Store the website on GitHub

We need to store the code of our static website somewhere. This could be a Git repo on Azure DevOps, GitHub or any other Git-enabled repository. We'll use GitHub in this example. Create a new GitHub repository at h

TypeScript - The left-hand side of an assignment expression may not be an optional property access

It has been a while since I last did some front-end development, so my TypeScript skills are kind of rusty. Today I tried to write the following code. In this code I'm subscribing to a Server-Sent Event and try to update an element on the screen (I'm not using Angular or any other framework for this small widget). The TypeScript compiler didn't like this and returned the following error:

The left-hand side of an assignment expression may not be an optional property access.

The reason is that I tried to use the optional chaining (?.) operator on the left-hand side of an assignment. To solve the error, I have to use an if statement that serves as a type guard before the assignment:
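The original snippet is not included in the excerpt; a reduced illustration of the same compiler error and its type-guard fix (the interface and names are made up for the example):

```typescript
interface StatusWidget {
  label?: { text: string }; // optional, mirroring a DOM lookup that may return null
}

function setLabel(widget: StatusWidget, text: string): void {
  // widget.label?.text = text;  // error: the left-hand side of an assignment
  //                             // expression may not be an optional property access

  // The fix: an if statement acting as a type guard narrows `widget.label`
  // from `{ text: string } | undefined` to `{ text: string }`.
  if (widget.label) {
    widget.label.text = text;
  }
}
```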

ElasticSearch–Uninstall failed on Windows

Yesterday I shared how I got into trouble when trying to uninstall an ElasticSearch instance. I explained how we could investigate the issue and find the root cause by checking the log files. Today I want to continue by explaining what I found in the log files and how I solved the problem. While browsing through the log file, I noticed the following error message:

Calling custom action Elastic.Installer.Msi!Elastic.Installer.Msi.Elasticsearch.CustomActions.Immediate.ElasticsearchExistingDirectoriesAction.ElasticsearchExistingDirectories
System.Exception: Can not execute Elastic.InstallerHosts.Elasticsearch.Tasks.Install.ExistingDirectoriesTask the model that it was passed has the following errors
BADELASTICSEARCHYAMLFILE: The elasticsearch.yml file we found in ES_PATH_CONF appears to be invalid and prevented seeding current values.

It seems that ElasticSearch checks the elasticsearch.yml configuration file during the uninstall and doesn't like what it finds there (which se

Troubleshoot an MSI uninstall

When trying to uninstall ElasticSearch through Add/Remove Programs in Windows, the uninstall silently failed. No error messages, no information in the Event Viewer, nothing... Let's see how we can investigate what is going wrong. To do so, we'll run the uninstall from the command line instead of using the Add/Remove Programs feature in Windows.

Uninstall an MSI from the command line

The command we need to use is msiexec. It can be used to install an MSI program, but of course we want to do an uninstall, so we need to include the /x or /uninstall parameter. If you have the MSI file available, you can use the following command:

msiexec.exe /x "c:\elasticsearch.msi"

If you don't have the MSI file anymore, you can do an uninstall using the Product GUID (I'll show you how to get the Product GUID below):

msiexec.exe /x {11111111-1111-1111-1111-111111111111}

Find the Product GUID of an installed program

There are multiple ways to get the Product GUID of the product you
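The excerpt cuts off at that point; one common way, sketched in PowerShell, is to read the uninstall keys in the registry (the display name filter is an example):

```powershell
# List installed programs whose name matches; PSChildName holds the Product GUID.
Get-ChildItem 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall' |
    Get-ItemProperty |
    Where-Object { $_.DisplayName -like '*Elasticsearch*' } |
    Select-Object DisplayName, PSChildName
```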

Visual Studio 2022 - Check for memory leaks

When you see the memory usage of your application increase, it can be a challenge to find the root cause. In that case capturing and analyzing memory dumps may be your best remaining option. In this post I'll show you how you can use Visual Studio to analyze your memory dumps.

Create a dump file

Before we can analyze a memory dump, we first need to create a dump file. One way to do this is through the dotnet monitor command line tool, or you can do it directly from Task Manager by right-clicking the desired process and choosing Create dump file.

Analyze the dump file in Visual Studio

Now that we have a dump file, we can open it in Visual Studio:
Open the .dmp file in Visual Studio.
Choose Debug Managed Memory from the list of actions.
Visual Studio will try to load all the symbols and analyze all the memory information in the dump file. Be patient, as this can take a while. Once Visual Studio has done its work, you will see the Managed Memory vie

Red pill vs Blue pill - Outcome not output

After more than 15 years in the IT industry, working on both small and large projects in multiple industries and companies, I have to make the following (unfortunate) observation: despite the rise of Agile and product thinking, most organisations still operate with a project mindset, not a product mindset. To explain the difference between the two, I have to talk about outputs vs outcomes.

What are outputs?

Outputs can be seen as deliverables, something like a feature or a new product release. The success of a project-oriented team is measured in output (e.g. velocity: how many story points did we deliver in this sprint?). Did the team succeed in delivering features according to specification and on time? A consequence of this mindset is that the team is not focused on whether a delivered feature got used or whether the feature really solved the business problem. We may assume that the delivered features will lead to specific outcomes. We may even share those assumptions w

C#–Handling cancellation requests

When creating an async method in C#, I typically add the option to pass a CancellationToken. That is the easy part. But what if a consumer of your method uses this CancellationToken to request a cancellation? What is the proper way to cancel your code? Let's find out... In most cases it will be sufficient to pass the cancellation token to a lower-level API, but if we are providing the lowest-level API, it is up to us to correctly handle the cancellation request. The correct way to handle a cancellation is by throwing an OperationCanceledException when cancellation is requested. To help you with this, a convenient ThrowIfCancellationRequested() method exists on the CancellationToken struct. Here are some other considerations to take into account when implementing cancellation:

- Don't cancel if your operation incurs side effects that would leave the system in an inconsistent state.
- Don't throw an OperationCanceledException when the work has already completed.
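A minimal sketch of what that looks like in a lowest-level API (the method and the work it does are invented for the example):

```csharp
using System.Threading;
using System.Threading.Tasks;

public static class Processor
{
    // Illustrative lowest-level API: checks the token between units of work.
    public static async Task ProcessAsync(int items, CancellationToken cancellationToken = default)
    {
        for (var i = 0; i < items; i++)
        {
            // Throws OperationCanceledException if cancellation was requested.
            cancellationToken.ThrowIfCancellationRequested();

            await Task.Delay(10, cancellationToken); // pass the token further down
        }
    }
}
```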

Azure SQL Performance

No database platform is immune to potential performance issues, so any tool that can help me spot them is a welcome addition to my toolbox. Recently I noticed the following project on Github: . This project contains a PowerShell script that can find some common causes of performance issues in Azure SQL Server:

- Check the statistics:
  - whether the number of rows in the statistics differs from rows_sampled;
  - whether the statistics haven't been updated for more than 15 days.
- Check if we have any auto-tuning recommendations.
- Check the statistics associated with any index (the same two checks as above).
- Check if MAXDOP is 0.
- Check if we have any index that is more than 50% fragmented.
- Check if we have missing indexes.
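As an illustration of one of those checks, index fragmentation can also be queried by hand via a standard DMV; the query below is my own sketch, not the script's code (the 50% threshold matches the check above):

```sql
-- Indexes with more than 50% fragmentation (run against the target database).
SELECT OBJECT_NAME(ips.object_id) AS table_name,
       i.name                     AS index_name,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id AND i.index_id = ips.index_id
WHERE ips.avg_fragmentation_in_percent > 50
ORDER BY ips.avg_fragmentation_in_percent DESC;
```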

NuGet Central Package Management

The way your NuGet package dependencies are managed has changed over the years. Originally, the list of NuGet packages used by a project was managed inside a packages.config file. More recently, the concept of a <PackageReference /> was introduced that allowed you to manage your NuGet dependencies directly within your project file. Both approaches manage dependencies at the project level. This is not ideal for multi-project solutions, as every project can use a different version of the same dependency. Visual Studio can help you consolidate package versions, but it remains cumbersome when your solutions start to grow. Starting with NuGet 6.2, you can centrally manage the dependency versions in your projects with the introduction of a Directory.Packages.props file.

Getting started

To get started with central package management, create a Directory.Packages.props file at the root of your solution. Remark: It is possible to use multiple Directory.Packages.
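A minimal Directory.Packages.props looks like this (the pinned package is an example):

```xml
<!-- Directory.Packages.props at the solution root -->
<Project>
  <PropertyGroup>
    <ManagePackageVersionsCentrally>true</ManagePackageVersionsCentrally>
  </PropertyGroup>
  <ItemGroup>
    <!-- Versions live here, not in the individual .csproj files -->
    <PackageVersion Include="Newtonsoft.Json" Version="13.0.1" />
  </ItemGroup>
</Project>
```

Each project then references the package without a Version attribute, e.g. <PackageReference Include="Newtonsoft.Json" />, and the version is resolved centrally.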