Keep running llamacpp in background #10096
Fulgurance asked this question in Q&A (unanswered; 1 comment, 2 replies):

Hi everyone, I have now run a few tests with llama-cli. I would like to know whether it is possible to keep the chat mode running in the background and, when needed, send a request to the already-running model instead of restarting the model every time, because starting the model takes a noticeable amount of time.
You can run llama-cli in conversation mode, for example:
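A minimal sketch, with the model path as a placeholder (check `llama-cli --help` on your build for the exact flag names):

```bash
# -cnv / --conversation starts an interactive chat loop: the model is
# loaded once and each new prompt reuses the already-loaded weights,
# so you avoid paying the startup cost on every question.
./llama-cli -m models/your-model.gguf -cnv
```

This keeps the model resident between prompts, though it still ties up the terminal session it runs in.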
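If you want it fully detached in the background and queryable on demand (which is what the question describes), note that the same repo also ships llama-server, which keeps the model loaded and exposes an HTTP API. A sketch, assuming a placeholder model path and the default port 8080:

```bash
# Load the model once; it stays resident behind an HTTP endpoint.
./llama-server -m models/your-model.gguf --port 8080 &

# Later, query the already-running model without reloading the weights.
curl http://localhost:8080/completion \
  -H "Content-Type: application/json" \
  -d '{"prompt": "Hello!", "n_predict": 64}'
```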