Thank you very much for this project; it's really great. I'd like to know whether the improved Torch XLA has been tested for the maximum number of TPUs it can support when scaling up. Previously, with the original Torch XLA, we encountered unknown errors once the number of TPUs exceeded a certain threshold. Also, has the training speed of Torch XLA been compared against JAX under identical conditions on TPU?