On Friday, the 8th of August 2025, I was a bit frustrated with the lack of AI assistant support in Qt Creator!
Previously I had only used GitHub Copilot with Qt Creator, but I didn’t like the idea of the LLM running somewhere in the cloud instead of locally.
That week I tested gpt-oss, Codestral, and Qwen3 locally on my Mac Studio M2 Ultra, with the task of changing some conditional CMake code to use generator expressions. I remember Qwen3 doing the best of them all.
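For context, the kind of refactor I was asking for looks roughly like this (a hypothetical snippet, not the actual code I was working on):

```cmake
# Before: platform handling spread over an if()/else() block
if(WIN32)
  target_compile_definitions(myapp PRIVATE PLATFORM_WINDOWS)
else()
  target_compile_definitions(myapp PRIVATE PLATFORM_UNIX)
endif()

# After: the same intent expressed with generator expressions,
# evaluated at generation time rather than at configure time
target_compile_definitions(myapp PRIVATE
  $<$<PLATFORM_ID:Windows>:PLATFORM_WINDOWS>
  $<$<NOT:$<PLATFORM_ID:Windows>>:PLATFORM_UNIX>
)
```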
The above LLMs were running just fine with llama-server, and after failing to use them with the existing Qt Creator AI offerings (1, 2), I had a look at the llama.cpp website and found out about their coding assistants: llama.vim and llama.vscode. But that was it, no Qt Creator support.
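For reference, serving a local model for FIM completion with llama-server looks along these lines (the model filename is an assumption, and the flags are the kind the llama.vim README suggests, not necessarily what I used):

```shell
# Serve a local GGUF model on port 8012; llama.vim and llama.vscode
# talk to llama-server's /infill endpoint for fill-in-the-middle.
llama-server \
  -m Qwen3-Coder-30B-A3B-Instruct-Q4_K_M.gguf \
  --port 8012 \
  -ngl 99 \
  --ctx-size 0 \
  --cache-reuse 256
```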
Then I noticed that llama.vim was implemented in a single file. Next, I thought, what if I ask Qwen3-Coder-30B-A3B something like:
given the vim plugin delimited by triple backticks: [...] can you create a similar Qt Creator plugin?
To my surprise it started with:
Qt Creator Plugin for LLaMA Infill Completion
I’ll create a Qt Creator plugin that provides similar functionality to the vim plugin you’ve shared. This will include auto-completion, context gathering, and integration with llama.cpp server.
Plugin Structure
The plugin will consist of:
- Core plugin class with initialization
- Auto-completion system with FIM (Fill-In-the-Middle)
- Context gathering and caching mechanisms
- Integration with Qt Creator’s editor framework
- Configuration UI
Let me implement this as a complete plugin: