
ModuleNotFoundError: No module named 'flash_attn' #175

@jingli-wtbox

Description


Error

Following the steps in the README to run finetune_llama-2-7b-32k-mqa.sh, I got the error below:

```
Traceback (most recent call last):
  File "/home/ubuntu/training/OpenChatKit/training/dist_clm_train.py", line 478, in <module>
    main()
  File "/home/ubuntu/training/OpenChatKit/training/dist_clm_train.py", line 443, in main
    pipe = get_pp_module(args, config, device, use_dp)
  File "/home/ubuntu/training/OpenChatKit/training/pipeline_parallel/dist_pp_utils.py", line 7, in get_pp_module
    return GpipeAsync(args, config, device, use_dp)
  File "/home/ubuntu/training/OpenChatKit/training/pipeline_parallel/dist_gpipe_pipeline_async.py", line 197, in __init__
    self.model = _StageMiddle(args, config, device)
  File "/home/ubuntu/training/OpenChatKit/training/modules/dist_gpt_pp_module.py", line 130, in __init__
    super(GPTStageMiddle, self).__init__(args, config)
  File "/home/ubuntu/training/OpenChatKit/training/modules/dist_gpt_pp_module.py", line 35, in __init__
    from .llama_modules import GPTEmbeddings, GPTBlock, GPTLMHead
  File "/home/ubuntu/training/OpenChatKit/training/modules/llama_modules.py", line 37, in <module>
    from flash_attn.layers.rotary import (
ModuleNotFoundError: No module named 'flash_attn'
```
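The traceback suggests the flash-attn package is simply not installed in the environment running the script. As a pre-flight check before launching training, something like the minimal sketch below can confirm the package is importable; it assumes a CUDA-capable environment, since flash-attn is published on PyPI and its README recommends installing it with `pip install flash-attn --no-build-isolation` (which requires CUDA and a matching PyTorch build already present).

```python
# Minimal sketch (assumes a CUDA-capable Python environment): verify that
# flash_attn is importable before launching dist_clm_train.py.
import importlib.util

if importlib.util.find_spec("flash_attn") is None:
    # flash-attn is published on PyPI; its README recommends installing
    # without build isolation, with CUDA and PyTorch already installed:
    #   pip install flash-attn --no-build-isolation
    raise SystemExit("flash_attn not found; install it into this environment first")

# The submodule that llama_modules.py imports at line 37.
import flash_attn.layers.rotary  # noqa: F401

print("flash_attn imported OK")
```

Note that this only diagnoses the missing dependency; if the import still fails after installation, the interpreter running the script is likely a different environment from the one pip installed into.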
