Commit 52f5f70

Author: jax authors
Merge pull request #20840 from sagelywizard:patch-2
PiperOrigin-RevId: 626472429
2 parents: 8424824 + c980dc4

1 file changed (+1, -1)

cloud_tpu_colabs/Pmap_Cookbook.ipynb

Lines changed: 1 addition & 1 deletion
@@ -19,7 +19,7 @@
 "\n",
 "This notebook is an introduction to writing single-program multiple-data (SPMD) programs in JAX, and executing them synchronously in parallel on multiple devices, such as multiple GPUs or multiple TPU cores. The SPMD model is useful for computations like training neural networks with synchronous gradient descent algorithms, and can be used for data-parallel as well as model-parallel computations.\n",
 "\n",
-"**Note:** To run this notebook with any parallelism, you'll need multiple XLA devices available, e.g. with a multi-GPU machine, a Google Cloud TPU or a Kaggle TPU VM. The required features are not supported by the Google Colab TPU runtime at this time.\n",
+"**Note:** To run this notebook with any parallelism, you'll need multiple XLA devices available, e.g. with a multi-GPU machine, a Colab TPU, a Google Cloud TPU or a Kaggle TPU VM.\n",
 "\n",
 "The code in this notebook is simple. For an example of how to use these tools to do data-parallel neural network training, check out [the SPMD MNIST example](https://github.com/google/jax/blob/main/examples/spmd_mnist_classifier_fromscratch.py) or the much more capable [Trax library](https://github.com/google/trax/)."
 ]
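For context on the notebook this commit edits: it introduces jax.pmap, which maps a function over a leading array axis and runs one copy of it per XLA device. Below is a minimal sketch of that pattern, assuming a machine with more than one XLA device; the squaring function and variable names are illustrative only and are not taken from the notebook.

import jax
import jax.numpy as jnp

# pmap needs one or more XLA devices; each device receives one slice of the input.
n_devices = jax.device_count()

# Replicate a simple elementwise function across all available devices.
square = jax.pmap(lambda x: x ** 2)

# The leading axis length matches the device count: one shard per device.
xs = jnp.arange(n_devices * 4.0).reshape(n_devices, 4)
print(square(xs))  # each device squares its own row in parallel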
