How to Run a Model on a GPU with bentoml in k8s? #4279
Unanswered
2232729885 asked this question in General
Replies: 1 comment · 1 reply
- Did you include |
1 reply
Below is my bentofile.yaml:

```yaml
service: "service.py:svc"
include:
python:
  index_url: https://pypi.tuna.tsinghua.edu.cn/simple
  requirements_txt: "requirements.txt"
docker:
  python_version: "3.8"
  cuda_version: "11.7.1"
```
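For context, the `cuda_version` field in bentofile.yaml only bakes the CUDA libraries into the container image; the model still has to be placed on the device inside `service.py`. A minimal sketch of that device-selection step, assuming a PyTorch model (the asker's actual `service.py` and model names are not shown, so everything below is hypothetical):

```python
# Sketch only: having CUDA in the image and seeing the GPU via nvidia-smi
# does not by itself put any work on the GPU -- the service code must move
# the model onto the device explicitly (e.g. with torch's .to(device)).

def pick_device() -> str:
    """Return "cuda" if a CUDA-capable torch install sees a GPU, else "cpu"."""
    try:
        import torch  # optional dependency; the sketch degrades to CPU without it
        if torch.cuda.is_available():
            return "cuda"
    except ImportError:
        pass
    return "cpu"


if __name__ == "__main__":
    device = pick_device()
    print(f"model would be loaded onto: {device}")
    # In a real service.py one would then do something like:
    #   model = torch.load("model.pt", map_location=device)  # path assumed
    #   model.to(device)
    # before wiring the model into the bentoml service object ("svc").
```

If the service never calls anything like `.to("cuda")` (or the framework's equivalent), inference runs on the CPU and `nvidia-smi` shows no process, exactly as described.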
After `bentoml build` and `bentoml containerize`, I start a pod in k8s. I can run `nvidia-smi` inside the pod and it succeeds (the GPU is listed), but no process runs on the GPU. What should I do?
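The thread itself has no full answer, but one thing commonly checked in this situation is whether the pod spec explicitly requests the GPU resource. A hypothetical manifest fragment (the asker's actual manifest is not shown; names and image tag are assumptions):

```yaml
# Hypothetical pod spec -- requires the NVIDIA device plugin on the cluster.
apiVersion: v1
kind: Pod
metadata:
  name: bento-gpu-test          # name assumed
spec:
  containers:
    - name: bento-service       # container name assumed
      image: my-bento:latest    # tag assumed; use the image from bentoml containerize
      resources:
        limits:
          nvidia.com/gpu: 1     # schedules onto a GPU node and mounts the device
```

Even when `nvidia-smi` works inside the pod, the combination of this resource request plus device placement in `service.py` is what actually puts inference work on the GPU.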