Replies: 3 comments 3 replies
-
This falls under one of the secondary objectives in the "p1 : LLM-based code completion engine at the edge 1" announcement. Dropping the link here for visibility.
-
I've actually been working on something similar: split-window LLM responses with keyboard shortcuts for prompting and code insertion. If it helps with your implementation decisions or saves you some work, check it out: VimLM. Happy to answer questions about the approach I took for the UI. Just thought it might be relevant to what you're discussing.
-
I had been using https://github.com/madox2/vim-ai before I discovered that llama.vim existed. I think it has what you're wishing for; see AIEdit and AIChat in particular.
-
It would be cool to see a chat feature implemented in Vim so that the user can ask questions about their codebase. This is inspired by many of the leading code editors like Cursor.ai, GitHub Copilot, and Windsurf.
A simple example of what this would look like:

This would increase the utility of this tool by making better use of llama-server for coding assistance.
I'm aware that this project is focused on 'Local LLM-assisted text completion'. Does this feature fall within the scope of this project?
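To make the idea concrete, here's a minimal Python sketch of how such a chat feature could talk to llama-server, which exposes an OpenAI-compatible /v1/chat/completions endpoint. The default port 8080, the function names, and the system-prompt scheme for passing codebase context are all assumptions for illustration, not llama.vim code:

```python
import json
import urllib.request

# Assumed default llama-server address; adjust to your setup.
SERVER_URL = "http://127.0.0.1:8080/v1/chat/completions"

def build_chat_request(messages, url=SERVER_URL):
    """Build an OpenAI-compatible chat completion request for llama-server."""
    payload = json.dumps({"messages": messages, "stream": False}).encode("utf-8")
    return urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )

def ask(question, context=""):
    """Ask a question, optionally prepending code context, and return the reply text."""
    messages = []
    if context:
        # Hypothetical scheme: pass the relevant code as a system prompt.
        messages.append({"role": "system",
                         "content": "You are a coding assistant. Codebase context:\n" + context})
    messages.append({"role": "user", "content": question})
    with urllib.request.urlopen(build_chat_request(messages)) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]
```

A Vim plugin could call something like ask() with the current buffer (or visual selection) as context and render the reply in a split window.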