Commit 993abb1

Author: jax authors (committed)
Merge pull request #19985 from jakevdp:doc-tagline
PiperOrigin-RevId: 615843373
2 parents f99284e + 72b2321

File tree: 2 files changed, +4 −6 lines


README.md

Lines changed: 2 additions & 3 deletions
@@ -17,9 +17,8 @@
 ## What is JAX?

-JAX is [Autograd](https://github.com/hips/autograd) and [XLA](https://www.tensorflow.org/xla),
-brought together for high-performance numerical computing, including
-large-scale machine learning research.
+JAX is a Python library for accelerator-oriented array computation and program transformation,
+designed for high-performance numerical computing and large-scale machine learning.

 With its updated version of [Autograd](https://github.com/hips/autograd),
 JAX can automatically differentiate native
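The retained context lines refer to JAX's Autograd-derived automatic differentiation. As a minimal illustrative sketch (assuming `jax` is installed; the function `f` here is a hypothetical example, not from the diff), `jax.grad` transforms a native Python function into a function computing its gradient:

```python
import jax

def f(x):
    # A plain Python function; its derivative is f'(x) = 3 * x**2.
    return x ** 3

# jax.grad returns a new function that evaluates the gradient of f.
df = jax.grad(f)
print(df(2.0))  # expect 12.0
```

This is the "program transformation" aspect named in the new tagline: `grad` (like `jit` and `vmap`) takes a function and returns a transformed function.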

docs/index.rst

Lines changed: 2 additions & 3 deletions
@@ -1,7 +1,8 @@
 JAX: High-Performance Array Computing
 =====================================

-JAX is Autograd_ and XLA_, brought together for high-performance numerical computing.
+JAX is a Python library for accelerator-oriented array computation and program transformation,
+designed for high-performance numerical computing and large-scale machine learning.

 If you're looking to train neural networks, use Flax_ and start with its documentation.
 Some associated tools are Optax_ and Orbax_.
@@ -93,8 +94,6 @@ For an end-to-end transformer library built on JAX, see MaxText_.
 glossary


-.. _Autograd: https://github.com/hips/autograd
-.. _XLA: https://openxla.org/xla
 .. _Flax: https://flax.readthedocs.io/
 .. _Orbax: https://orbax.readthedocs.io/
 .. _Optax: https://optax.readthedocs.io/
