I’m currently working on a new feature in one of our microservices. As this new feature could have a large performance impact, we did some performance benchmarks up front using BenchmarkDotNet. The results looked promising, but we were not 100% confident that they were representative of real-life usage. Therefore we decided to implement A/B testing.
Yesterday I showed how to implement A/B testing in .NET using .NET Feature Management. Today I want to continue on the same topic and show you how we added telemetry.
Measuring and analyzing results
The most critical aspect of A/B testing is measuring results. You need to track relevant metrics for each variation to determine which one performs better. The simplest way to do this is to fall back on the built-in logging in .NET Core:
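A minimal sketch of what that could look like, assuming a hypothetical variant feature flag named "NewAlgorithm" and the IVariantFeatureManager from Microsoft.FeatureManagement (the actual flag and service names in the original code may differ):

```csharp
using System.Diagnostics;
using Microsoft.Extensions.Logging;
using Microsoft.FeatureManagement;

public class RecommendationService
{
    private readonly IVariantFeatureManager _featureManager;
    private readonly ILogger<RecommendationService> _logger;

    public RecommendationService(IVariantFeatureManager featureManager, ILogger<RecommendationService> logger)
    {
        _featureManager = featureManager;
        _logger = logger;
    }

    public async Task HandleAsync(CancellationToken cancellationToken)
    {
        // Resolve which variant this request was assigned to.
        Variant variant = await _featureManager.GetVariantAsync("NewAlgorithm", cancellationToken);

        var stopwatch = Stopwatch.StartNew();
        // ... run the code path that belongs to the assigned variant ...
        stopwatch.Stop();

        // Structured logging: the variant name and the duration become separate
        // properties, so they can be correlated when analyzing the results.
        _logger.LogInformation(
            "Handled request with variant {Variant} in {ElapsedMilliseconds} ms",
            variant?.Name,
            stopwatch.ElapsedMilliseconds);
    }
}
```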
Although this is a good starting point, we can simplify and improve this by taking advantage of the built-in telemetry through OpenTelemetry and/or Application Insights.
Using OpenTelemetry
As I want to compare request performance, it is sufficient to add an extra tag to the existing activity. This can easily be done through Activity.Current.SetTag():
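A minimal sketch, again assuming the hypothetical "NewAlgorithm" flag; the tag name "feature.variant" is my own choice, not necessarily the one used in the original code:

```csharp
using System.Diagnostics;
using Microsoft.FeatureManagement;

public class RecommendationService
{
    private readonly IVariantFeatureManager _featureManager;

    public RecommendationService(IVariantFeatureManager featureManager)
    {
        _featureManager = featureManager;
    }

    public async Task HandleAsync(CancellationToken cancellationToken)
    {
        Variant variant = await _featureManager.GetVariantAsync("NewAlgorithm", cancellationToken);

        // Activity.Current is the span that ASP.NET Core created for this request;
        // the extra tag is exported together with the rest of the span data.
        Activity.Current?.SetTag("feature.variant", variant?.Name);

        // ... run the code path that belongs to the assigned variant ...
    }
}
```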
Inside our Aspire Dashboard, this extra tag is visible at the span level:
Using Application Insights
In theory, when using Application Insights, you should not have to change anything, as Application Insights should be able to pick up the information from the Activity context. However, I noticed that nothing was added to the traced requests.
I solved it by directly calling the active telemetry instance and adding the tag there:
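A minimal sketch of that approach, assuming that "the active telemetry instance" refers to the RequestTelemetry item that the Application Insights ASP.NET Core SDK exposes through HttpContext.Features; the property name "feature.variant" is again an assumption:

```csharp
using Microsoft.ApplicationInsights.DataContracts;
using Microsoft.AspNetCore.Http;
using Microsoft.FeatureManagement;

public class VariantTelemetryMiddleware
{
    private readonly RequestDelegate _next;

    public VariantTelemetryMiddleware(RequestDelegate next) => _next = next;

    public async Task InvokeAsync(HttpContext context, IVariantFeatureManager featureManager)
    {
        Variant variant = await featureManager.GetVariantAsync("NewAlgorithm", context.RequestAborted);

        // The Application Insights ASP.NET Core SDK exposes the telemetry item for the
        // current request as an HttpContext feature; adding a property here makes it
        // show up as a custom dimension on the traced request.
        var requestTelemetry = context.Features.Get<RequestTelemetry>();
        if (requestTelemetry is not null)
        {
            requestTelemetry.Properties["feature.variant"] = variant?.Name ?? "none";
        }

        await _next(context);
    }
}
```

In this sketch the middleware would be registered with app.UseMiddleware<VariantTelemetryMiddleware>() so it runs as part of the request pipeline.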
Now the extra tag appeared as a custom dimension:
More information
.NET feature flag management - Azure App Configuration | Microsoft Learn
.NET Observability with OpenTelemetry - .NET | Microsoft Learn
Application Insights for ASP.NET Core applications - Azure Monitor | Microsoft Learn