
Add better XPU support, particularly for onnx ops #4


Closed

Conversation

@exdysa (Owner) commented Apr 26, 2025

Your program is excellent. With your permission, may I use your code to improve my basic XPU support?

wlpxxx added 2 commits February 9, 2025 06:10
onnxruntime-directml
pytorch2.5.1+xpu
tensorrt models seem to only support f16
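
For context, the commits name three backends: PyTorch 2.5.1's XPU build, onnxruntime-directml, and TensorRT (f16 only). Since the head repository was later deleted, the actual code is unrecoverable; the following is a minimal sketch of how such backend selection might look, assuming PyTorch >= 2.5 (which ships the torch.xpu namespace) and the onnxruntime-directml package (which exposes "DmlExecutionProvider"). The helper names and the model path are hypothetical, not the PR's actual code.

```python
# Minimal sketch, not the PR's code. Assumes PyTorch >= 2.5 (torch.xpu)
# and the onnxruntime-directml package. Helper names are hypothetical.
import torch
import onnxruntime as ort

def pick_torch_device() -> torch.device:
    """Prefer an Intel XPU, then CUDA, then fall back to CPU."""
    if hasattr(torch, "xpu") and torch.xpu.is_available():
        return torch.device("xpu")
    if torch.cuda.is_available():
        return torch.device("cuda")
    return torch.device("cpu")

def make_ort_session(model_path: str) -> ort.InferenceSession:
    """Build an ONNX Runtime session, preferring DirectML when available."""
    available = ort.get_available_providers()
    preferred = [p for p in ("DmlExecutionProvider", "CPUExecutionProvider")
                 if p in available]
    return ort.InferenceSession(model_path, providers=preferred)

if __name__ == "__main__":
    print(pick_torch_device())
    # session = make_ort_session("model.onnx")  # "model.onnx" is a placeholder
```

The TensorRT commit message would additionally imply casting weights and inputs to float16 (e.g. `tensor.half()`) before export, though the deleted repository makes the exact approach unverifiable.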
@exdysa exdysa closed this Apr 26, 2025
@exdysa exdysa reopened this Apr 26, 2025
@eighteen-k-gold-malow eighteen-k-gold-malow closed this by deleting the head repository May 4, 2025
@exdysa (Owner, Author) commented May 4, 2025

;_;

@eighteen-k-gold-malow commented

Up to you.

@exdysa (Owner, Author) commented May 4, 2025

> Up to you.

Please forgive any rudeness on my part.
You have an XPU device and I don't. My version doesn't work, so with your permission I would like to combine your program with mine. I hope this can still be done? Since the repository was deleted, it will take longer >_<
