I installed bonito version 0.9.0 with `pip install ont-bonito` and flash-attention with `pip install flash-attn --no-build-isolation` on a Linux machine. I then ran bonito with one of the transformer models (`dna_r10.4.1_e8.2_400bps_sup@v5.0.0`). The following messages appeared:
```
> reading pod5
> outputting unaligned bam
> loading model dna_r10.4.1_e8.2_400bps_sup@v5.0.0
please install flash-attn to use the transformer module: `pip install flash-attn --no-build-isolation`
Traceback (most recent call last):
  File "[...]/venv-bonito/lib64/python3.9/site-packages/bonito/nn.py", line 441, in from_dict
    layer = typ(**model_dict)
  File "[...]/venv-bonito/lib64/python3.9/site-packages/bonito/transformer/model.py", line 95, in __init__
    self.self_attn = MultiHeadAttention(
  File "[...]/venv-bonito/lib64/python3.9/site-packages/bonito/transformer/model.py", line 55, in __init__
    self.rotary_emb = RotaryEmbedding(self.rotary_dim, interleaved=False)
NameError: name 'RotaryEmbedding' is not defined

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "[...]/venv-bonito/bin/bonito", line 8, in <module>
    sys.exit(main())
  [...]
  File "[...]/venv-bonito/lib64/python3.9/site-packages/bonito/nn.py", line 443, in from_dict
    raise Exception(f'Failed to build layer of type {typ} with args {model_dict}') from e
Exception: Failed to build layer of type <class 'bonito.transformer.model.TransformerEncoderLayer'> with args {'d_model': 512, 'nhead': 8, 'dim_feedforward': 2048, 'deepnorm_alpha': 2.4494897, 'deepnorm_beta': 0.2886751, 'attn_window': [127, 128]}
```
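Note that flash-attn *is* installed here, yet bonito prints the "please install flash-attn" hint and then fails with a `NameError`. A quick way to check which flash-attn version is installed and whether it still exposes the symbol bonito needs (the exact import path is an assumption based on the `NameError` above; `flash_attn.layers.rotary` is where flash-attn 2.x ships `RotaryEmbedding`):

```python
# Sanity check of the flash-attn installation. The import path below is
# an assumption inferred from the NameError in the traceback.
import flash_attn

print(flash_attn.__version__)  # 2.8.0.post2 fails; 2.7.4.post1 works

try:
    from flash_attn.layers.rotary import RotaryEmbedding
    print("RotaryEmbedding import OK")
except Exception as e:
    print(f"RotaryEmbedding import failed: {e}")
```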
The problem appears to be an incompatibility with the latest flash-attn release, v2.8.0.post2. Installing an earlier version of flash-attention solved it:

```
pip install --no-build-isolation flash-attn==2.7.4.post1
```

Perhaps this solution will be helpful to other users.
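The misleading install hint is consistent with bonito guarding its flash-attn imports in a try/except. A minimal sketch of that pattern (an illustration, not bonito's actual code):

```python
# Sketch of a guarded import (assumed pattern, not bonito's exact code):
# if anything inside the try block fails, the install hint is printed even
# though flash-attn is present but incompatible, and RotaryEmbedding is
# left undefined until the model builds an attention layer, at which point
# it surfaces as the NameError seen in the traceback.
try:
    from flash_attn.layers.rotary import RotaryEmbedding
except ImportError:
    print("please install flash-attn to use the transformer module: "
          "`pip install flash-attn --no-build-isolation`")
```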