1 parent 981eeca commit a045b7e
cmake/external_projects/vllm_flash_attn.cmake
@@ -38,7 +38,7 @@ else()
   FetchContent_Declare(
           vllm-flash-attn
           GIT_REPOSITORY https://github.com/vllm-project/flash-attention.git
-          GIT_TAG 763ad155a1c826f71ff318f41edb1e4e5e376ddb
+          GIT_TAG 2c6bcfc0feb3d9d4a57b243fc159a68aa9933f5b
           GIT_PROGRESS TRUE
           # Don't share the vllm-flash-attn build between build types
           BINARY_DIR ${CMAKE_BINARY_DIR}/vllm-flash-attn
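
The hunk above only swaps the commit pinned by GIT_TAG inside an existing FetchContent_Declare() call in vllm_flash_attn.cmake. For reference, a minimal standalone sketch of the same pinning pattern is below; the project() name and the closing FetchContent_MakeAvailable() call are illustrative assumptions and not taken from the patched file, while the repository URL and the new commit hash come from the diff.

# Minimal sketch: pinning a FetchContent dependency to an exact commit.
# NOTE: the project name and the FetchContent_MakeAvailable() call are
# assumptions for illustration; only GIT_REPOSITORY and GIT_TAG are taken
# from the diff above.
cmake_minimum_required(VERSION 3.24)
project(pin_flash_attn_example LANGUAGES NONE)

include(FetchContent)

FetchContent_Declare(
        vllm-flash-attn
        GIT_REPOSITORY https://github.com/vllm-project/flash-attention.git
        # Pin to a full 40-character SHA so the fetched sources are
        # reproducible and cannot drift when the upstream branch moves.
        GIT_TAG 2c6bcfc0feb3d9d4a57b243fc159a68aa9933f5b
        GIT_PROGRESS TRUE
)

# Downloads the pinned sources and, if the fetched project ships a
# CMakeLists.txt, adds it to the build (this assumes the project's own
# build requirements, e.g. a CUDA toolchain, are satisfied).
FetchContent_MakeAvailable(vllm-flash-attn)

The commit itself changes only the pinned SHA; the rest of the Declare block is unchanged.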
test-qwen
@@ -0,0 +1 @@
+Subproject commit 34c31c0af8fc975140b8c85548fefa1eb7f523e4