🎉 Flash-DMA v0.1.0 - Initial Release #104
LoserCheems announced in Announcements
Replies: 1 comment
LGTM🤗
We're excited to announce the first official release of Flash-DMA (Flash Dynamic Mask Attention)!
🚀 What is Flash-DMA?
Flash-DMA is a high-performance attention implementation that combines the memory-efficient kernel design of Flash Attention with the sparse computation of Dynamic Mask Attention.
✨ Key Features
🔥 Performance
🛠️ Multiple Backends
📏 Long Sequence Support
`keep_window_size`
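The body text under these feature headings did not survive extraction, so as a rough illustration of what a `keep_window_size`-style dynamic mask can mean (the function below is a hypothetical dense reference sketch, not the library's API or kernels), each query attends only to its `keep_window_size` highest-scoring keys:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax; exp(-inf) -> 0 drops masked positions.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def dynamic_mask_attention(q, k, v, keep_window_size):
    """Toy dense reference (hypothetical semantics): each query row
    keeps only its keep_window_size top-scoring keys before softmax."""
    scores = q @ k.T / np.sqrt(q.shape[-1])          # (Lq, Lk)
    if keep_window_size < scores.shape[-1]:
        # Per-row threshold: the keep_window_size-th largest score.
        kth = np.partition(scores, -keep_window_size, axis=-1)[:, -keep_window_size]
        scores = np.where(scores >= kth[:, None], scores, -np.inf)
    return softmax(scores) @ v

rng = np.random.default_rng(0)
q = rng.standard_normal((4, 8))
k = rng.standard_normal((16, 8))
v = rng.standard_normal((16, 8))
out = dynamic_mask_attention(q, k, v, keep_window_size=4)
```

The real kernels compute this sort of masked attention without materializing the dense score matrix, which is what makes long sequences tractable; this sketch only shows the masking idea.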
📦 Installation
Prerequisites
Install from Source
What's Changed
New Contributors
Full Changelog: https://github.com/SmallDoges/flash-dmattn/commits/0.1.0
This discussion was created from the release 🎉 Flash-DMA v0.1.0 - Initial Release.