Has anyone tried to use some modules from VLLM to replace torch.nn.functional.scaled_dot_product_attention? #4546
galaxy202411 asked in Q&A · 0 replies
I mean replacing torch.nn.functional.scaled_dot_product_attention only, so the rest of the code can be kept unchanged.
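For context, a minimal sketch of what such a swap would have to match: `F.scaled_dot_product_attention` (with no mask or dropout) computes `softmax(Q K^T / sqrt(d)) V`, so any vLLM kernel used as a replacement must accept the same `(batch, heads, seq, head_dim)` tensors and return the same result. The `attention` wrapper below marks the single swap point; the vLLM substitution itself is hypothetical, since vLLM's attention kernels generally expect their own KV-cache metadata rather than plain Q/K/V tensors, so a true drop-in replacement usually needs extra plumbing.

```python
import math

import torch
import torch.nn.functional as F


def sdpa_reference(q, k, v):
    # Plain-PyTorch equivalent of F.scaled_dot_product_attention
    # without mask/dropout: softmax(Q K^T / sqrt(d)) V
    scale = 1.0 / math.sqrt(q.size(-1))
    attn = torch.softmax(q @ k.transpose(-2, -1) * scale, dim=-1)
    return attn @ v


def attention(q, k, v):
    # Swap point: replace this call with a vLLM attention kernel that
    # accepts the same (batch, heads, seq, head_dim) layout.
    # (Hypothetical; vLLM kernels typically also require KV-cache and
    # sequence-length metadata, so extra adaptation code is needed.)
    return F.scaled_dot_product_attention(q, k, v)


q = torch.randn(2, 4, 8, 16)  # (batch, heads, seq_len, head_dim)
k = torch.randn(2, 4, 8, 16)
v = torch.randn(2, 4, 8, 16)
assert torch.allclose(attention(q, k, v), sdpa_reference(q, k, v), atol=1e-5)
```

Verifying the wrapper against the reference like this gives a quick correctness check for whatever kernel is substituted in.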