Can we run Kaito on self-managed k8s (like k3s, minikube, etc.) running on a Windows PC with an Nvidia GPU? #876
-
The comment below suggested that this is quite possible, but how? Is there an article anywhere on how to do this, please? The consideration is that the self-host community, home labs, some companies, etc. need to run the LLMs locally.

Quoting @Fei-Guo:

> You can run Kaito in self-managed k8s if you have already added GPU nodes to the cluster (with the proper GPU driver and k8s device plugin installed). In this case, you can just add those nodes to the Kaito Workspace CR as `preferredNodes` in the Resource spec. Kaito will skip provisioning GPU nodes and just run the inference workload on the existing nodes.
Replies: 1 comment · 1 reply
-
Yes, you can. Install the workspace controller following this guide.
If you have already installed another node provisioning controller that supports the Karpenter-core APIs, you can skip the steps for installing gpu-provisioner.
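For reference, a rough sketch of the bring-your-own-GPU flow (the chart path and node name below are assumptions for illustration; the linked guide is authoritative for the exact install commands):

```sh
# 1. Verify the existing GPU node advertises the nvidia.com/gpu resource
#    (i.e. the Nvidia driver and k8s device plugin are working).
kubectl describe node <your-gpu-node> | grep -i nvidia.com/gpu

# 2. Install only the Kaito workspace controller via Helm.
#    Chart location is assumed here (local clone of the kaito repo);
#    take the real chart name/repo from the installation guide.
helm install kaito-workspace ./charts/kaito/workspace \
  --namespace kaito-workspace --create-namespace

# gpu-provisioner is not needed because the GPU node already exists.
```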
If you already have a GPU node ready, you can choose that node manually by setting `preferredNodes`, e.g.:
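A minimal Workspace manifest along those lines might look like the sketch below (the node name `my-gpu-node`, the label, and the `falcon-7b` preset are placeholders; adapt them to your cluster and model):

```yaml
apiVersion: kaito.sh/v1alpha1
kind: Workspace
metadata:
  name: workspace-falcon-7b
resource:
  labelSelector:
    matchLabels:
      apps: falcon-7b
  preferredNodes:
    - my-gpu-node   # existing GPU node; Kaito skips provisioning and schedules the workload here
inference:
  preset:
    name: falcon-7b
```

With `preferredNodes` set to nodes that already exist in the cluster, Kaito does not try to provision new GPU machines and simply runs the inference workload on the nodes you listed.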