I already showed in a previous post how you could integrate Semantic Kernel with the .NET Core LoggerFactory to see what is going on while interacting with your OpenAI backend.
Here is the link in case you missed it: Debugging Semantic Kernel in C# (bartwullems.blogspot.com).
An even better solution is to use the OpenTelemetry integration. To set this up, we first create a LoggerFactory instance that uses OpenTelemetry as a logging provider:
using var loggerFactory = LoggerFactory.Create(builder =>
{
    // Add OpenTelemetry as a logging provider
    builder.AddOpenTelemetry(options =>
    {
        options.AddOtlpExporter();
        // Include the formatted log message; this defaults to false
        options.IncludeFormattedMessage = true;
        options.IncludeScopes = true;
    });
    builder.SetMinimumLevel(LogLevel.Trace);
});
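By default the OTLP exporter sends its data to http://localhost:4317 (or to the address set through the OTEL_EXPORTER_OTLP_ENDPOINT environment variable). If your collector, for example the Aspire Dashboard, listens somewhere else, you can configure the endpoint explicitly. This fragment would replace the plain AddOtlpExporter() call above; the address is only an illustrative assumption:

```csharp
options.AddOtlpExporter(exporterOptions =>
{
    // Point the exporter at your OTLP collector (hypothetical address)
    exporterOptions.Endpoint = new Uri("http://localhost:4317");
});
```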
Now we need to register this LoggerFactory as a service on the Semantic Kernel builder:
// 'client' is an HttpClient configured for the local Ollama instance
var semanticKernelBuilder = Kernel.CreateBuilder()
    .AddOpenAIChatCompletion(      // We use the Semantic Kernel OpenAI connector...
        modelId: "phi3",
        apiKey: null,
        endpoint: new Uri("http://localhost:11434"),
        httpClient: client);       // ...pointed at the Ollama OpenAI-compatible endpoint
semanticKernelBuilder.Services.AddSingleton(loggerFactory);
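To actually produce log entries, build the kernel and run a prompt. The prompt text below is just an illustrative example:

```csharp
var kernel = semanticKernelBuilder.Build();

// Any interaction with the model now emits log records through the
// OpenTelemetry logging provider registered above
var result = await kernel.InvokePromptAsync("Why is the sky blue?");
Console.WriteLine(result);
```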
If we now take a look at our Aspire Dashboard, we can see the logged messages appear:
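In case you don't have the dashboard running yet, the standalone Aspire Dashboard container can be started like this (image name and port mapping based on the official .NET docs; verify them against your own setup):

```shell
# Run the standalone Aspire Dashboard; the UI listens on port 18888,
# OTLP (gRPC) ingestion is mapped from host port 4317 to container port 18889
docker run --rm -it -p 18888:18888 -p 4317:18889 \
  mcr.microsoft.com/dotnet/aspire-dashboard:latest
```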
It is also possible to collect related metrics and traces. To do so, add the following code to your Program.cs:
using var traceProvider = Sdk.CreateTracerProviderBuilder()
    .AddSource("Microsoft.SemanticKernel*")
    .AddOtlpExporter()
    .Build();

using var meterProvider = Sdk.CreateMeterProviderBuilder()
    .AddMeter("Microsoft.SemanticKernel*")
    .AddOtlpExporter()
    .Build();
If we now take a look at the Aspire Dashboard, we can see both the metrics and the end-to-end trace: