No context swaps for llama-server #8310

Closed · Answered by ggerganov
Tureti asked this question in Q&A

You need to send `cache_prompt` with every request.
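
For illustration, a minimal sketch of sending `cache_prompt` on each request, assuming llama-server is listening locally on port 8080 and serving its `/completion` endpoint; the port, helper name, and prompt text are illustrative, not taken from this discussion:

```python
import json
import urllib.request

# Assumed local llama-server endpoint; adjust host/port to your setup.
SERVER_URL = "http://localhost:8080/completion"

def complete(prompt: str, n_predict: int = 64) -> str:
    """Send a completion request, asking the server to reuse its KV cache."""
    payload = {
        "prompt": prompt,
        "n_predict": n_predict,
        # Sent with every request: reuse the cached prompt/KV state from the
        # previous request instead of reprocessing the shared prefix.
        "cache_prompt": True,
    }
    req = urllib.request.Request(
        SERVER_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["content"]

if __name__ == "__main__":
    # Requests that share a common prefix benefit most, since the server can
    # skip re-evaluating the cached portion when cache_prompt is set.
    print(complete("You are a helpful assistant.\nUser: Hello\nAssistant:"))
```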

Replies: 1 comment · 1 reply

Answer selected by Tureti
Category: Q&A
Labels: None yet
2 participants