
Open source two-simplicial attention kernels #4445

Open · wants to merge 1 commit into main

Conversation

@choutim (Contributor) commented Jul 3, 2025

Summary:
Kernels for two-simplicial attention of the form:
L = Q @ (K1 X K2)
P = softmax(L, axis=[-1, -2])
O = P @ (V1 X V2)

Differential Revision: D77756574
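As a reading aid for the three lines above, here is a minimal NumPy reference sketch of the math, not the fused kernels this PR ships. It assumes `X` denotes the elementwise key/value interaction that makes the logits a trilinear form, and that `softmax(axis=[-1, -2])` is a joint softmax over both key axes; the function name and scaling are illustrative, not from the PR.

```python
import numpy as np

def two_simplicial_attention(q, k1, k2, v1, v2):
    """Unfused reference for two-simplicial attention (sketch).

    q: (n, d) queries; k1, v1: (m1, d); k2, v2: (m2, d).
    Logits form a rank-3 tensor L[i, j, k] = <q_i, k1_j * k2_k>
    (elementwise product of the two keys), and the softmax is
    taken jointly over the (j, k) key axes.
    """
    d = q.shape[-1]
    # L[i, j, k] = sum_c q[i, c] * k1[j, c] * k2[k, c], scaled as in standard attention
    logits = np.einsum("ic,jc,kc->ijk", q, k1, k2) / np.sqrt(d)
    # joint, numerically stable softmax over the last two axes
    flat = logits.reshape(logits.shape[0], -1)
    flat = flat - flat.max(axis=-1, keepdims=True)
    p = np.exp(flat)
    p = p / p.sum(axis=-1, keepdims=True)
    p = p.reshape(logits.shape)
    # O[i, c] = sum_{j, k} P[i, j, k] * v1[j, c] * v2[k, c]
    return np.einsum("ijk,jc,kc->ic", p, v1, v2)
```

With a single key pair (m1 = m2 = 1) the joint softmax collapses to 1 and each output row is just `v1[0] * v2[0]`, which makes the value-side elementwise product easy to sanity-check.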

netlify bot commented Jul 3, 2025

Deploy Preview for pytorch-fbgemm-docs ready!

Latest commit: 508b0c0
Latest deploy log: https://app.netlify.com/projects/pytorch-fbgemm-docs/deploys/687061235e9c7b000844fbc4
Deploy Preview: https://deploy-preview-4445--pytorch-fbgemm-docs.netlify.app

@facebook-github-bot (Contributor) commented:

This pull request was exported from Phabricator. Differential Revision: D77756574

@choutim force-pushed the export-D77756574 branch from 6ac3f33 to a0828ae on July 3, 2025 at 23:45
choutim added a commit to choutim/FBGEMM that referenced this pull request Jul 3, 2025
Summary:

X-link: facebookresearch/FBGEMM#1508

Kernels for two-simplicial attention of the form:
L = Q @ (K1 X K2)
P = softmax(L, axis=[-1, -2])
O = P @ (V1 X V2)

Differential Revision: D77756574
@choutim force-pushed the export-D77756574 branch from a0828ae to 57e09f1 on July 3, 2025 at 23:46
choutim added a commit to choutim/FBGEMM that referenced this pull request Jul 3, 2025 (same summary as above)
choutim added a commit to choutim/FBGEMM that referenced this pull request Jul 3, 2025 (same summary, now with "Pull Request resolved: pytorch#4445")
@choutim force-pushed the export-D77756574 branch from 57e09f1 to 162905d on July 3, 2025 at 23:49
choutim added a commit to choutim/FBGEMM that referenced this pull request Jul 3, 2025 (same summary as above)
@choutim force-pushed the export-D77756574 branch from 162905d to 6aac7d6 on July 3, 2025 at 23:54
@choutim force-pushed the export-D77756574 branch from 6aac7d6 to d77d8e8 on July 11, 2025 at 00:39
choutim added a commit to choutim/FBGEMM that referenced this pull request Jul 11, 2025 (same summary, now with "Reviewed By: jiecaoyu, sijiac")
@choutim force-pushed the export-D77756574 branch from d77d8e8 to 8464ef1 on July 11, 2025 at 00:40
choutim added a commit to choutim/FBGEMM that referenced this pull request Jul 11, 2025 (same summary as above)
choutim added a commit to choutim/FBGEMM that referenced this pull request Jul 11, 2025 (same summary as above)
@choutim force-pushed the export-D77756574 branch from 8464ef1 to 55e7fc5 on July 11, 2025 at 00:44
@choutim force-pushed the export-D77756574 branch from 55e7fc5 to 508b0c0 on July 11, 2025 at 00:56