CUDA: 4D FlashAttention support #14628


Merged

CUDA: fix WMMA FA kernel

2f9b295
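
For context, "4D FlashAttention support" means the fused attention kernels accept tensors with a batch dimension in addition to (head, sequence, head-dim), rather than only 3D inputs. Below is a minimal, unfused CUDA sketch of what that computes, assuming a row-major [n_batch][n_head][n_seq][head_dim] layout; all names and the launch shape are illustrative, not the PR's actual code.

```cuda
// Unfused reference for 4D attention: O = softmax(Q K^T / sqrt(d)) V,
// computed independently for every (batch, head) slice.
// Layout assumed row-major [n_batch][n_head][n_seq][head_dim] (illustrative).
#include <math.h>

__global__ void attn_4d_naive(const float * __restrict__ Q,
                              const float * __restrict__ K,
                              const float * __restrict__ V,
                              float       * __restrict__ O,
                              int n_head, int n_seq, int head_dim) {
    const int b = blockIdx.z;  // batch index: the "4th" dimension
    const int h = blockIdx.y;  // attention head
    const int i = blockIdx.x;  // query row
    const int d = threadIdx.x; // one thread per output element
    if (d >= head_dim) return;

    const size_t base  = ((size_t) (b*n_head + h))*n_seq*head_dim;
    const float  scale = rsqrtf((float) head_dim);

    // pass 1: row maximum of the scores, for a numerically stable softmax
    float smax = -INFINITY;
    for (int j = 0; j < n_seq; ++j) {
        float dot = 0.0f;
        for (int k = 0; k < head_dim; ++k) {
            dot += Q[base + (size_t) i*head_dim + k]*K[base + (size_t) j*head_dim + k];
        }
        smax = fmaxf(smax, dot*scale);
    }

    // pass 2: softmax-weighted sum of V along output dimension d
    float sum = 0.0f;
    float acc = 0.0f;
    for (int j = 0; j < n_seq; ++j) {
        float dot = 0.0f;
        for (int k = 0; k < head_dim; ++k) {
            dot += Q[base + (size_t) i*head_dim + k]*K[base + (size_t) j*head_dim + k];
        }
        const float p = expf(dot*scale - smax);
        sum += p;
        acc += p*V[base + (size_t) j*head_dim + d];
    }
    O[base + (size_t) i*head_dim + d] = acc/sum;
}
```

A launch such as `attn_4d_naive<<<dim3(n_seq, n_head, n_batch), head_dim>>>(Q, K, V, O, n_head, n_seq, head_dim)` covers every (batch, head, query-row) slice. Each thread redundantly recomputes the score row, so this shows only the semantics, not FlashAttention's tiled single-pass strategy.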
CI check labeler: succeeded Jul 11, 2025 in 7s
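
The commit above fixes the WMMA ("warp matrix multiply-accumulate") variant of the FlashAttention kernel, which runs the Q·K^T and softmax·V matrix products on tensor cores. The following is a generic illustration of the `nvcuda::wmma` API that such kernels are built on, computing a single 16x16x16 half-precision tile; it is not the kernel fixed in this PR and requires sm_70 or newer.

```cuda
// One warp computes a single 16x16 tile: C = A*B on tensor cores.
// Generic nvcuda::wmma usage (sm_70+); not the kernel from this PR.
#include <mma.h>
#include <cuda_fp16.h>

using namespace nvcuda;

__global__ void wmma_16x16x16(const half * A, const half * B, float * C) {
    // fragments are 16x16 tiles distributed across the 32 threads of a warp
    wmma::fragment<wmma::matrix_a,    16, 16, 16, half, wmma::row_major> a;
    wmma::fragment<wmma::matrix_b,    16, 16, 16, half, wmma::col_major> b;
    wmma::fragment<wmma::accumulator, 16, 16, 16, float>                 acc;

    wmma::fill_fragment(acc, 0.0f);
    wmma::load_matrix_sync(a, A, 16); // leading dimension = 16
    wmma::load_matrix_sync(b, B, 16);
    wmma::mma_sync(acc, a, b, acc);   // the tensor-core multiply-accumulate
    wmma::store_matrix_sync(C, acc, 16, wmma::mem_row_major);
}
```

A single warp (`wmma_16x16x16<<<1, 32>>>(A, B, C)`) computes the whole tile; production kernels like the one fixed here tile large matrices across many warps.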