how to set systemPrompt in code? #67

Closed · Answered by giladgd
eisneim asked this question in Q&A
You can pass a custom systemPrompt when you create a LlamaChatSession object.

Regarding your WizardCoder example, you can customize the GeneralChatPromptWrapper with the role names you mentioned:

import {fileURLToPath} from "url";
import path from "path";
import {
    LlamaModel, LlamaContext, LlamaChatSession, GeneralChatPromptWrapper
} from "node-llama-cpp";

const __dirname = path.dirname(fileURLToPath(import.meta.url));

const model = new LlamaModel({
    modelPath: path.join(__dirname, "models", "codellama-13b.Q3_K_M.gguf"),
    promptWrapper: new GeneralChatPromptWrapper({
        instructionName: "Instruction",
        responseName: "Response"
    })
});
const context = new Ll…
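For completeness, here is a hedged sketch of how the truncated lines above likely continue, based on the node-llama-cpp v2 API used in this snippet: create a LlamaContext from the model, then pass a custom systemPrompt when constructing the LlamaChatSession. The system prompt text is a placeholder, not from the original answer.

```typescript
// Sketch (assumes the node-llama-cpp v2 API from the snippet above;
// requires a local GGUF model file, so paths and prompt text are placeholders).
const context = new LlamaContext({model});
const session = new LlamaChatSession({
    context,
    // This is where the custom system prompt goes:
    systemPrompt: "You are an expert TypeScript programmer. Answer concisely." // placeholder
});

const answer = await session.prompt("Write a function that sums an array of numbers");
console.log(answer);
```

Note that systemPrompt is set per session, so creating a new LlamaChatSession with a different systemPrompt lets you reuse the same loaded model with different instructions.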

Replies: 2 comments

Answer selected by giladgd
Category: Q&A · 2 participants