I just bought a laptop with an RTX 3060; running docker llama.cpp:full-cuda fails with CUDA error 35 at ggml-cuda.cu:5509: CUDA driver version is insufficient for CUDA runtime version #3182
-
I am using CUDA 11.5.119 on my host, but the full-cuda Dockerfile uses CUDA 11.7. How do I resolve this?
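CUDA error 35 means the driver on the host is older than the CUDA runtime inside the image, so the thing to check is the CUDA version reported by the driver itself (the "CUDA Version" field in the nvidia-smi header), not the toolkit version from nvcc. A minimal check sketch, assuming the NVIDIA Container Toolkit is installed and that the nvidia/cuda:11.7.1-base-ubuntu22.04 tag is available (both are my assumptions, not from this thread):

```bash
# Host side: the "CUDA Version: X.Y" in the nvidia-smi header is what the
# driver supports; it must be >= the runtime the image was built against (11.7 here).
nvidia-smi

# Container side: verify the GPU and driver are actually visible from Docker.
# If this fails, the problem is GPU passthrough, not a CUDA version mismatch.
docker run --rm --gpus all nvidia/cuda:11.7.1-base-ubuntu22.04 nvidia-smi
```

If the driver's reported CUDA version really is below 11.7, the options are to upgrade the NVIDIA driver on the host or to rebuild the image against an older CUDA base that matches the driver.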
-
@staviq thanks! I stayed up all night trying to "fix" this, but now the problem is that I realised the driver is actually 12.2.
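If nvidia-smi on the host reports CUDA Version 12.2, the driver is newer than the 11.7 runtime in the image, so a plain version mismatch should not be the cause; in practice this error then usually means the container is not getting access to the driver at all (NVIDIA Container Toolkit missing, or docker run started without --gpus all). A rough sketch of the usual setup on Ubuntu/Debian, assuming the package name and the nvidia-ctk configure step apply to your distro:

```bash
# Install and wire up the NVIDIA Container Toolkit so Docker can expose the driver.
sudo apt-get install -y nvidia-container-toolkit
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker

# Sanity check: nvidia-smi inside a CUDA container should report the same
# driver version (and CUDA 12.2) as the host does.
docker run --rm --gpus all nvidia/cuda:12.2.0-base-ubuntu22.04 nvidia-smi
```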
-
I just bought a laptop with an RTX 3060. Running docker llama.cpp:full-cuda fails with:

CUDA error 35 at ggml-cuda.cu:5509: CUDA driver version is insufficient for CUDA runtime version
current device: 21996

How do I resolve this? I'm using the GGUF Airoboros 13B model from TheBloke on Hugging Face.
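The nonsense "current device: 21996" value suggests the CUDA driver API never initialised inside the container, which is what happens when the container is started without GPU access or the host driver cannot back the image's CUDA runtime. A minimal run sketch, assuming the NVIDIA Container Toolkit is already configured; the ghcr.io tag, the mount path, and the GGUF file name are illustrative assumptions, not taken from the thread:

```bash
# --gpus all is what hands the host driver to the container; without it every
# CUDA call fails, typically with "driver version is insufficient" errors.
docker run --rm --gpus all \
  -v /path/to/models:/models \
  ghcr.io/ggerganov/llama.cpp:full-cuda \
  --run -m /models/airoboros-l2-13b.Q4_K_M.gguf \
  -p "Hello" -n 64 --n-gpu-layers 35
```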