I want to debug the vllm_flash_attn code, but the release tarball ships without debug info, so I would like to build it from source. However, I cannot find the source code and do not know how to build it. Any help would be appreciated.

Replies: 2 comments
-
See: https://github.com/vllm-project/flash-attention and https://github.com/vllm-project/vllm/blob/main/CMakeLists.txt#L507
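For anyone landing here with the same question, here is a rough sketch of what a from-source build with debug info might look like. It assumes a Linux machine with CUDA, a C++ toolchain, and CMake, and that your vLLM version honors the `VLLM_FLASH_ATTN_SRC_DIR` and `CMAKE_BUILD_TYPE` environment variables; check the linked CMakeLists.txt and vLLM's setup.py to confirm before relying on it. This is not an official recipe.

```bash
# Sketch only: variable names below are assumptions to verify against
# the CMakeLists.txt linked above for your vLLM version.

# vLLM vendors its flash-attention fork via CMake FetchContent, so clone
# both the fork and vLLM itself.
git clone https://github.com/vllm-project/flash-attention.git
git clone https://github.com/vllm-project/vllm.git
cd vllm

# Point the build at the local flash-attention checkout instead of the
# pinned FetchContent download, and keep debug info in the compiled objects.
export VLLM_FLASH_ATTN_SRC_DIR="$(pwd)/../flash-attention"
export CMAKE_BUILD_TYPE=RelWithDebInfo   # or Debug for unoptimized symbols

# An editable install compiles the CUDA extensions, including
# vllm_flash_attn, from the sources above.
pip install -e . --no-build-isolation
```

Building this way can take a long time, since all CUDA kernels are compiled locally; limiting the target architectures (e.g. via `TORCH_CUDA_ARCH_LIST`) to your GPU can speed it up considerably.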
-
Thanks, it is here.