a few questions #45
Answered by
giladgd
Sep 23, 2023
llama.cpp supports some parameters you can configure to customize its CUDA behavior. You can read more about them here, then set the parameters you need as environment variables when you compile using `node-llama-cpp download --cuda`, and use the `gpuLayers` option when you run your code.
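As a rough sketch of what that workflow looks like (the specific environment variable shown is just an illustrative placeholder; check llama.cpp's CUDA documentation for the actual option names your version supports):

```shell
# Rebuild llama.cpp with CUDA support, passing a llama.cpp compile-time
# option as an environment variable. LLAMA_EXAMPLE_OPTION is a placeholder;
# substitute the real option name from llama.cpp's docs.
LLAMA_EXAMPLE_OPTION=1 npx node-llama-cpp download --cuda
```

After the CUDA-enabled build completes, runtime behavior (such as how many model layers are offloaded to the GPU) is controlled from your own code via options like `gpuLayers`, rather than at compile time.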