Last month Microsoft announced the .NET Smart Components, an experimental set of AI-powered UI components that can be added to your .NET apps. They asked us to give these components a try and share our feedback.
In a previous post I tried the Smart ComboBox. Now let’s have a look at the Smart TextArea component.
The idea of the Smart TextArea is that it gives you a smart autocomplete that can be tailored to the specific context you want to use it in. It looks at what the user is currently typing and tries to make suggestions based on the configured context and tone.
It feels quite similar to prompt engineering, but with a focus on helping you type text faster and more easily.
Here is an example use case from the documentation:
Your app might allow agents to respond to customer/staff/user messages by typing free text, e.g., in a live chat system, support ticket system, CRM, bug tracker, etc. You can use Smart TextArea to help those agents be more productive, writing better-phrased responses with fewer keystrokes.
Time to give it a try!
Integrate Smart TextArea in an ASP.NET Core MVC app
The first steps are the same as in the previous post, but I’ll repeat them here.
We start by creating a new ASP.NET Core MVC application.
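If you are starting from scratch, the .NET CLI can scaffold the project (the project name below is just an example):

dotnet new mvc -o SmartTextAreaDemo
cd SmartTextAreaDemo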
Remark: The Smart Components are supported in both Blazor and MVC/RazorPages applications.
Add the SmartComponents.AspNetCore NuGet package to your project:
dotnet add package --prerelease SmartComponents.AspNetCore
Open your Program.cs file and add the following lines to register the necessary services:
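A minimal sketch of what that registration looks like (the AddSmartComponents call comes from the SmartComponents.AspNetCore package; your Program.cs will of course contain more than this):

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddControllersWithViews();

// Register the Smart Components services
builder.Services.AddSmartComponents();

var app = builder.Build();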
Now open the _ViewImports.cshtml file (in the Views folder) and reference the Smart Component tag helpers:
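That comes down to one extra line in _ViewImports.cshtml:

@addTagHelper *, SmartComponents.AspNetCore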
Okay. Now we can finally add our Smart TextArea to our Razor view. To tweak the component to our context, we need to specify a user role that describes who is typing and for what reason, optionally giving other contextual information:
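A sketch of what this can look like in the view; the role description is just an example for a customer support scenario:

<smart-textarea
    user-role="Customer support agent replying to customer enquiries about orders" />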
Next to a user role, we can specify an array of user phrases that help the language model reply using your preferred tone/voice and common phrases, and provide any information about policies, URLs, or anything else that may be relevant to incorporate into the suggested completions:
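For example (the phrases below are purely illustrative and should be replaced with your own):

<smart-textarea
    user-role="Customer support agent replying to customer enquiries about orders"
    user-phrases="@(new[] {
        "Thank you for contacting us.",
        "Could you share your order number so we can investigate?",
        "We're sorry to hear about the issue you experienced."
    })" />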
Integrate a self-hosted AI backend
We are not there yet. Before we can use our Smart TextArea we need to provide an AI backend that our component can use to talk to a Large Language Model.
In this example I’ll use Ollama to self-host an AI backend and Llama 3 as the LLM.
Remark: If you want to learn how to use Ollama, check out this post.
Update your appsettings.json file and add the following section:
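Here is a sketch of that section, assuming Ollama is running locally on its default port and the model has already been pulled:

"SmartComponents": {
  "SelfHosted": true,
  "DeploymentName": "llama3",
  "Endpoint": "http://localhost:11434/"
}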
Make sure that the DeploymentName matches the deployed model. I’m using Llama 3, which just became available in Ollama.
We also need to register an IInferenceBackend. As Ollama exposes an OpenAI-compatible API, we can use a package already created by Microsoft. Therefore we first need to install the SmartComponents.Inference.OpenAI package through NuGet:
dotnet add package --prerelease SmartComponents.Inference.OpenAI
Now we can register it inside our Program.cs file:
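A sketch of that registration, chaining the OpenAI-compatible backend onto the AddSmartComponents call from earlier:

using SmartComponents.Inference.OpenAI;

// Register Smart Components and point them at the OpenAI-compatible backend
builder.Services.AddSmartComponents()
    .WithInferenceBackend<OpenAIInferenceBackend>();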
Let’s give it a try…
Now we can give our component a try. We run our application and start typing:
You can see that suggestions appear after typing the first part of a sentence. Nice!
Remark: Be aware that the quality of the results can vary drastically depending on the language model used.
More information
smartcomponents/docs/smart-textarea.md at main · dotnet-smartcomponents/smartcomponents (github.com)