Need help updating Kohya to work with CUDA 12.8 #3381
Mescalino-Tech started this conversation in General
So I managed to update torch, but:
11:53:13-657560 INFO Kohya_ss GUI version: v24.1.7
11:53:14-356431 INFO Submodule initialized and updated.
11:53:14-360434 INFO nVidia toolkit detected
11:53:19-603879 INFO Torch 2.9.0.dev20250813+cu128
11:53:19-725893 INFO Torch backend: nVidia CUDA 12.8 cuDNN 91002
11:53:19-729897 INFO Torch detected GPU: NVIDIA GeForce RTX 5060 Ti VRAM 16310 Arch (12, 0) Cores 36
11:53:19-743900 INFO Python version is 3.10.11 (tags/v3.10.11:7d4cc5a, Apr 5 2023, 00:38:17) [MSC v.1929 64 bit (AMD64)]
11:53:19-745901 INFO Verifying modules installation status from requirements_pytorch_windows.txt...
11:53:19-747904 WARNING Package wrong version: torch 2.9.0.dev20250813+cu128 required 2.1.2+cu118
11:53:19-748904 INFO Installing package: torch==2.1.2+cu118 --index-url https://download.pytorch.org/whl/cu118
11:55:20-924620 ERROR Error running pip: install --upgrade torch==2.1.2+cu118 --index-url https://download.pytorch.org/whl/cu118
11:55:20-926613 WARNING Package wrong version: torchvision 0.24.0.dev20250817+cu128 required 0.16.2+cu118
11:55:20-927614 INFO Installing package: torchvision==0.16.2+cu118 --index-url https://download.pytorch.org/whl/cu118
11:55:26-811020 INFO Verifying modules installation status from requirements_windows.txt...
11:55:26-815023 INFO Verifying modules installation status from requirements.txt...
11:56:05-351163 INFO headless: False
11:56:05-436554 INFO Using shell=True when running external commands...
K:\stable-diffusion\stable-diffusion-webui\extensions\kohya_ss\venv\lib\site-packages\gradio\analytics.py:106: UserWarning: IMPORTANT: You are using gradio version 4.43.0, however version 4.44.1 is available, please upgrade.
warnings.warn(
Running on local URL: http://127.0.0.1:7860
It says I need an older version of torch, but with the older version it says I need to update to cu128. And when I start training a LoRA, it fails.
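In case it helps with diagnosing this, here is a minimal sketch (my own assumption about how to check, not part of Kohya) that can be run with the venv's Python to confirm which torch build is actually active and whether it can drive the RTX 5060 Ti at all:

```python
# Minimal check, assuming it is run with the Kohya venv's python.exe.
import torch

print("torch:", torch.__version__)                  # 2.1.2+cu118 vs 2.9.0.dev...+cu128
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("device:", torch.cuda.get_device_name(0))
    # The RTX 5060 Ti reports compute capability (12, 0), matching the
    # "Arch (12, 0)" line in the log above; the pinned cu118 wheels predate
    # that architecture, which is presumably why they demand cu128.
    print("capability:", torch.cuda.get_device_capability(0))
```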