Replies: 1 comment
-
Hi! Customizing prompts easily is the core goal of this plugin. Maybe what you're looking for is:

```lua
require("llm").setup {
  prompts = {
    vicuna = { ... },
    alpaca = { ... }
  }
}
```
-
Different open source instruction or code models use different prompt templates; in some cases, a model's output degrades significantly if it is not prompted properly. There are several such templates, for example the Alpaca template and the ChatML template.
I think it would be useful to have a convenient way of changing the templates.
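For context, the Alpaca template mentioned above wraps the user's request in fixed markers. A minimal Lua sketch of assembling such a prompt (the `instruction` variable and its value are placeholders for illustration, not part of the plugin):

```lua
-- Sketch of building an Alpaca-style prompt string.
-- `instruction` is a hypothetical placeholder value.
local instruction = "Explain what a closure is in Lua."
local prompt = table.concat({
  "Below is an instruction that describes a task. ",
  "Write a response that appropriately completes the request.\n\n",
  "### Instruction:\n",
  instruction,
  "\n\n### Response:\n",
})
```

A model fine-tuned on this format typically continues generating text after the `### Response:` marker.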
Right now, I'm using a workaround. I copied the code from
$HOME/.local/share/nvim/lazy/llm.nvim/lua/llm/providers/llamacpp.lua
and changed the locals SYSTEM_BEGIN, SYSTEM_END, INST_BEGIN, and INST_END. Maybe there is a better way of doing this?
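To make the workaround concrete: here is a sketch of what those locals could be set to for a ChatML-style template. The delimiter values shown are the standard ChatML markers, not the plugin's defaults, so treat this as an illustration rather than the plugin's actual code:

```lua
-- Hypothetical replacement values using standard ChatML delimiters.
local SYSTEM_BEGIN = "<|im_start|>system\n"
local SYSTEM_END = "<|im_end|>\n"
local INST_BEGIN = "<|im_start|>user\n"
local INST_END = "<|im_end|>\n<|im_start|>assistant\n"
```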