Replies: 2 comments
-
Verified the GPU [a prerequisite for deploying ktransformers quantized models]: the RTX 2080 Ti is not supported. This thread is concluded.
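A likely explanation for the "not supported" verdict above: the GPU quantized kernels commonly used for this kind of deployment (Marlin-style kernels) are generally reported to require CUDA compute capability 8.0 (Ampere) or newer, while the RTX 2080 Ti is Turing at 7.5. A minimal sketch of that check, where the capability table and the 8.0 floor are assumptions for illustration, not values taken from the ktransformers codebase:

```python
# Assumed/illustrative mapping of GPU model -> CUDA compute capability.
KNOWN_CC = {
    "RTX 2080 Ti": (7, 5),  # Turing
    "RTX 3090": (8, 6),     # Ampere
    "RTX 4090": (8, 9),     # Ada Lovelace
}

# Assumed minimum compute capability for the quantized GPU kernels.
MIN_CC = (8, 0)

def supported(gpu: str) -> bool:
    """Return True if the GPU meets the assumed kernel requirement."""
    return KNOWN_CC[gpu] >= MIN_CC

print(supported("RTX 2080 Ti"))  # Turing (7.5) falls below the 8.0 floor
print(supported("RTX 3090"))
```

In practice you would read the capability at runtime (e.g. via `torch.cuda.get_device_capability()`) rather than from a table; the table here only makes the sketch self-contained.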
-
My PC specs: i5-12400, 192 GB DDR5, RTX 2080 Ti 22 GB. On Windows 10 I can currently run these models with LM Studio: DeepSeek-V2.5-1210-Q5_K_M and DeepSeek-R1-UD-IQ1_M. How do I run them on Ubuntu 24 (desktop edition), and how should I choose the versions of CUDA, Python, torch, and ktransformers? I want to install directly from a file like ktransformers-0.2.0+cu124torch24avx2-cp312-cp312-linux_x86_64.whl, so whl-based installation is preferred. Does anyone have detailed installation steps for these two models? Could someone write a dedicated, detailed deployment tutorial (installing from whl files where possible) for 100 GB+ models on 192 GB of RAM, the current maximum for consumer PCs? Most users won't set up a separate server just to run models, but they will upgrade their main PC.