In a previous post I introduced you to Continue in combination with Ollama, as a way to run a fully local AI Code Assistant.

Remark: This post is part of a bigger series. Here are the other related posts:

Part 1 – Introduction
Part 2 – Configuration
Part 3 – Editing and Actions
Part 4 – Learning from your codebase
Part 5 – Read your documentation
Part 6 (this post) – Troubleshooting

Although Continue really looks promising, I stumbled on some hurdles along the way. Here are some tips in case you encounter issues.

Tip 1 – Check the Continue logs

My first tip is to always check the logs. Continue provides good logging inside the IDE, so go to the Output tab and switch to the Continue source to see the generated output:

Tip 2 – Check the Continue LLM logs

Next to the output of Continue itself, you can find all the LLM-specific logs in the Continue LLM Prompt/Conversation output. So don't forget to check that output as well:

Tip 3 – Be patient
In a previous post I introduced you to Continue in combination with Ollama, as a way to run a fully local AI Code Assistant.

Remark: This post is part of a bigger series. Here are the other related posts:

Part 1 – Introduction
Part 2 – Configuration
Part 3 – Editing and Actions
Part 4 – Learning from your codebase
Part 5 (this post) – Read your documentation

Today I want to continue by having a look at how Continue can scrape your documentation website and make the content accessible inside your IDE.

The @docs context provider

To use this feature you need to use the @docs context provider:

Once you type @docs, you already get a long list of available documentation:

This is because Continue offers an out-of-the-box selection of pre-indexed documentation sites. (You can find the full list here.)

If you now ask a question, the indexed documentation is used to answer your question:

You can see the context that was used by expanding the context items section:

Index you
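For reference, the @docs provider and any additional documentation sites can also be declared in Continue's config.json. The snippet below is only a rough sketch based on my understanding of the configuration format (property names may vary between Continue versions, and the title and startUrl values are just placeholders):

```json
{
  "contextProviders": [
    { "name": "docs" }
  ],
  "docs": [
    {
      "title": "My project docs",
      "startUrl": "https://mydocs.example.com/"
    }
  ]
}
```

Once added, the site should show up in the @docs list after Continue has had a chance to index it, so give it a moment on the first run.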