Replies: 1 comment
-
I'm currently working on updating the documentation for version 3 to prepare for a stable release, so there's no version 3 documentation website I can link you to just yet. The solution you're looking for is to reset the chat history of the chat session, for example:

```javascript
import {fileURLToPath} from "url";
import path from "path";
import {getLlama, LlamaChatSession} from "node-llama-cpp";
const __dirname = path.dirname(fileURLToPath(import.meta.url));
const chatHistoryFilePath = path.join(__dirname, "chatHistory.json");
const llama = await getLlama();
const model = await llama.loadModel({
modelPath: path.join(__dirname, "models", "dolphin-2.1-mistral-7b.Q4_K_M.gguf")
});
const context = await model.createContext();
const session = new LlamaChatSession({
contextSequence: context.getSequence()
});
const q1 = "Hi there, how are you?";
console.log("User: " + q1);
const a1 = await session.prompt(q1);
console.log("AI: " + a1);
// after the first prompt, save the current chat history
const chatHistory = session.getChatHistory();
const q2 = "What did I just ask you?";
console.log("User: " + q2);
const a2 = await session.prompt(q2);
console.log("AI: " + a2);
// revert the chat to the state after the first prompt
session.setChatHistory(chatHistory);
const q2b = "What was my previous question?";
console.log("User: " + q2b);
const a2b = await session.prompt(q2b);
console.log("AI: " + a2b); |
-
I want a static chat session: I give it a multi-turn conversation once, as a way to guide the model's future responses.
How do I go about freezing the chat history? I don't want to create a new session and manage several contexts for each input, as I have around 50 different inputs, and the session-and-context-management approach gets my process killed within a few seconds (or it crashes and shows me a beautiful `IOT instruction (core dumped)`). It seems like a lot of work for something that should be baked directly into the library - surely static chat exists in some way? And if not, can it please be implemented?
I can't find decent documentation for the beta, so if this exists in the beta docs, no need to waste your time on a simple question like this - just a link to the beta docs will do 😄
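With the `getChatHistory()`/`setChatHistory()` methods shown in the reply above, the "frozen history" pattern being asked about might look roughly like this sketch; `seedPrompt` and `inputs` are hypothetical stand-ins for the guiding multi-turn conversation and the ~50 inputs:

```javascript
// build the guiding conversation once on a single session, then snapshot it
await session.prompt(seedPrompt); // seedPrompt: hypothetical guiding prompt
const frozenHistory = session.getChatHistory();

// reuse the same session and context for every input,
// rewinding to the frozen snapshot before each prompt
for (const input of inputs) { // inputs: hypothetical array of ~50 prompts
    session.setChatHistory(frozenHistory);
    console.log(await session.prompt(input));
}
```

This keeps a single session and context alive for all of the inputs, sidestepping the repeated setup and teardown that was crashing the process.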