
Supercharge your AI Workflow with GitHub Copilot Custom Prompt Files

If you're using AI assistants in your development workflow, you've probably typed the same instructions dozens of times. What if I told you there's a better way? VS Code's custom prompt files let you create reusable, structured prompts with IntelliSense support, turning your ad-hoc AI interactions into a streamlined, professional workflow.

What are custom prompt files?

Custom prompt files are structured documents that define reusable prompts for AI assistants. Instead of manually typing out context, instructions, and tool configurations every time, you create a .prompt.md file that encapsulates everything in one place.

Think of them as templates for your AI conversations—but with superpowers.

Why superpowers?

VS Code prompt files support a header section where you can define metadata and configuration, including:

  • Tool specifications: Define which tools or capabilities the AI should have access to
  • Context information: Set the stage with project-specific details
  • The preferred agent: Choose the agent that runs the prompt (ask, edit, agent (the default), or the name of a custom agent/chat mode)
  • Model preferences: Specify which AI model to use

Here's a simple example:
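
A minimal header sketch (the model name, tool list, and description below are illustrative, not required values):

```markdown
---
mode: agent
model: GPT-4o
tools: ['codebase']
description: 'Generate a new React form component'
---
Your goal is to generate a new React form component.
```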

Inside the body of the prompt file, you can reference:

  • Other workspace files by using Markdown links.
  • Tools using the #tool:<tool-name> syntax.
  • Variables using the ${variableName} syntax. You can reference the following variables:
    • Workspace variables - ${workspaceFolder}, ${workspaceFolderBasename}
    • Selection variables - ${selection}, ${selectedText}
    • File context variables - ${file}, ${fileBasename}, ${fileDirname}, ${fileBasenameNoExtension}
    • Input variables - ${input:variableName}, ${input:variableName:placeholder}
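
Putting those references together, a prompt body might look like this (the linked file path, tool name, and input variable name are illustrative):

```markdown
Generate documentation for ${fileBasename} in ${workspaceFolder}.
Follow the conventions described in [the style guide](./docs/style-guide.md).
Use #tool:codebase to find related files.
Title the output ${input:title:Enter a document title}.
```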

IntelliSense to the rescue

Here's my favorite part: I discovered yesterday that VS Code provides IntelliSense when editing these prompt files. That's right: autocomplete, validation, and helpful suggestions as you type.

When you're defining your header section, VS Code will:

  • Suggest available tools and their correct syntax
  • Validate your configuration structure
  • Provide inline documentation for different options
  • Catch typos and structural errors before you even run the prompt

This means you're not guessing at the right format or constantly referring to documentation. The editor guides you through the process, making prompt engineering feel as natural as writing code.



Examples

Looking for some good examples? The prompts folder in the Awesome Copilot repo is the place to be!

awesome-copilot/prompts at main · github/awesome-copilot

Getting started

To get started, we first need to create a prompt file:

  • Create a new file with the .prompt.md extension in the .github/prompts folder in the root of your workspace.
  • Or click on the gear icon in the chat window and choose Prompt files.
  • Add a header section with --- delimiters
    • Watch the IntelliSense magic happen as you type


  • Add your prompt content below the header
  • Save and reuse across your projects
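
The first two steps above can also be done from a terminal (the prompt name document-code is just an illustrative placeholder):

```shell
# Create the default prompts folder at the workspace root
mkdir -p .github/prompts

# Create an empty prompt file; the .prompt.md suffix is required
touch .github/prompts/document-code.prompt.md
```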

You can now use this prompt file by typing /<prompt-name> in the chat window:

Remark: You can pass extra information in the chat input field. For example: /create-react-form formName=MyForm.
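
For that invocation to work, the prompt file needs to declare the input variable. A minimal sketch of such a create-react-form.prompt.md (the fields and wording are illustrative):

```markdown
---
mode: agent
description: 'Generate a React form component'
---
Generate a new React form component named ${input:formName:Enter the form name}.
Place it in the components folder of ${workspaceFolder}.
```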

More information

  • Use prompt files in VS Code
  • awesome-copilot/prompts at main · github/awesome-copilot
