
Phind–Your personal programming assistant

With the release of ChatGPT, GitHub Copilot and Amazon CodeWhisperer, to name just a few, large language models are the (new?) cool kid in town, and you see a lot of new applications popping up trying to claim a part of this space.

If you are still in doubt whether these tools can help improve your developer productivity, check out this survey conducted and published by GitHub.

With 92% of developers already saying they use AI coding tools at work and in their personal time, it's clear AI is here to stay. 70% of the developers we surveyed say they already see significant benefits when using AI coding tools, and 81% of the developers we surveyed expect AI coding tools to make their teams more collaborative—which is a net benefit for companies looking to improve both developer velocity and the developer experience.

The list of available tools is long and keeps growing every day. Here are some I’m aware of:

And of course let us not forget ChatGPT.

Today I want to add another one to the list: Phind. Phind uses a combination of GPT-4 and its own model, which should hallucinate less and write better code.

I started by asking it to create a small application using the Task Parallel Library (TPL) Dataflow in C#:


Here is the exact prompt I used:

I want to create a new C# application using the Task Parallel Library. This application should read a CSV file and parse it using multiple datablocks. Can you give me an example on how to write this code?

The result is not bad, although the generated example doesn't really take advantage of TPL Dataflow: it first reads all the data into memory before the pipeline does any work, along the lines of the sketch below.
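A minimal sketch of that load-everything-first pattern (my own simplification, not Phind's literal output; the block names, the hard-coded data.csv path and the naive comma split are assumptions on my part):

using System;
using System.IO;
using System.Threading.Tasks.Dataflow;   // NuGet package: System.Threading.Tasks.Dataflow

class Program
{
    static void Main()
    {
        var parse = new TransformBlock<string, string[]>(line => line.Split(','));
        var print = new ActionBlock<string[]>(fields => Console.WriteLine(string.Join(" | ", fields)));
        parse.LinkTo(print, new DataflowLinkOptions { PropagateCompletion = true });

        // The whole file is read into memory before the pipeline does any work.
        foreach (var line in File.ReadAllLines("data.csv"))
            parse.Post(line);

        parse.Complete();
        print.Completion.Wait();   // blocking Wait() instead of await
    }
}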

Let's see if we can fix this. Asking Phind to rework the example gives the following result:

The Main method became async and we got rid of the Wait() statement as we wanted. This is much better and more in line with the asynchronous nature of the Task Parallel Library.
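Put together, an asynchronous, streaming version along those lines could look like the sketch below. Again, this is my own approximation rather than Phind's exact output; the block names, the MaxDegreeOfParallelism value and the simple comma split are assumptions:

using System;
using System.IO;
using System.Threading.Tasks;
using System.Threading.Tasks.Dataflow;   // NuGet package: System.Threading.Tasks.Dataflow

class Program
{
    static async Task Main()
    {
        // Parse lines in parallel as they arrive instead of loading the file up front.
        var parse = new TransformBlock<string, string[]>(
            line => line.Split(','),
            new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = 4 });

        var print = new ActionBlock<string[]>(
            fields => Console.WriteLine(string.Join(" | ", fields)));

        parse.LinkTo(print, new DataflowLinkOptions { PropagateCompletion = true });

        // Stream the CSV file line by line into the pipeline.
        using var reader = new StreamReader("data.csv");
        string? line;
        while ((line = await reader.ReadLineAsync()) != null)
        {
            await parse.SendAsync(line);
        }

        parse.Complete();
        await print.Completion;   // await the pipeline instead of calling Wait()
    }
}

Using SendAsync instead of Post means the producer will wait asynchronously if you later give the blocks a BoundedCapacity, and awaiting print.Completion keeps the whole flow non-blocking.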

Love it! (Of course, it is again a good example of how you as a developer still need to understand what is going on, so you can give the AI assistant hints to improve the code.)

Remark: You can also use it directly inside VSCode using this plugin.

If you compare this with what I got back from ChatGPT, the example created by Phind is much closer to what I expected:

And just for completeness, this is what I got back when I asked GitHub Copilot Chat the same question:

Not so good either. Phind is the clear winner in this example…

Remark: I tried some related prompts to further improve the results I got back from ChatGPT and GitHub Copilot, but I never reached the quality of the result I got from Phind.
