Error when trying to use Colossal-LLaMA-2 #5393
zhpacer started this conversation in Community | General
CUDA: Build cuda_11.8.r11.8/compiler.31833905_0
Following: https://github.com/hpcaitech/ColossalAI/tree/main/applications/Colossal-LLaMA-2

Error:
utils/flash_attention_patch.py", line 22, in <module>
    from colossalai.accelerator import get_accelerator
ModuleNotFoundError: No module named 'colossalai.accelerator'
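
This failure mode is consistent with an installed colossalai that predates the colossalai.accelerator subpackage, which flash_attention_patch.py imports at module level. A minimal diagnostic sketch (assuming only that colossalai is installed; the note about newer releases is an editorial assumption, not from the thread):

    # Check the installed colossalai and retry the exact import that fails.
    import colossalai

    print("colossalai version:", colossalai.__version__)
    try:
        # This subpackage only exists in newer colossalai releases (assumption).
        from colossalai.accelerator import get_accelerator
        print("colossalai.accelerator is importable")
    except ModuleNotFoundError:
        print("this colossalai build has no accelerator module; "
              "upgrade it to match the Colossal-LLaMA-2 scripts")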

Replies: 1 comment
After copying the accelerator directory into the installed colossalai directory, the script can continue, but then another error appears; it seems the installed version of colossalai is not the right one?
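
Copying the accelerator directory by hand patches only one subpackage while the rest of the installed wheel stays at the old version, which would explain the follow-up version error. A hedged sketch of a cleaner check (the advice to reinstall from the same source checkout is an editorial suggestion, not from the thread):

    # Confirm which colossalai tree the interpreter resolves and whether it
    # ships the accelerator subpackage; a stale site-packages install will
    # shadow a newer source checkout.
    import pathlib

    import colossalai

    pkg_root = pathlib.Path(colossalai.__file__).parent
    print("colossalai resolves to:", pkg_root)
    print("colossalai version:", colossalai.__version__)
    print("accelerator subpackage present:", (pkg_root / "accelerator").is_dir())

If the resolved path is a stale install, reinstalling colossalai from the same ColossalAI checkout that contains the Colossal-LLaMA-2 application (for example with pip install -e . from the repo root) keeps the package and the scripts in sync.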