2 files changed: +4 -6

File 1 of 2:
## What is JAX?
- JAX is [Autograd](https://github.com/hips/autograd) and [XLA](https://www.tensorflow.org/xla),
- brought together for high-performance numerical computing, including
- large-scale machine learning research.
+ JAX is a Python library for accelerator-oriented array computation and program transformation,
+ designed for high-performance numerical computing and large-scale machine learning.
With its updated version of [Autograd](https://github.com/hips/autograd),
JAX can automatically differentiate native

File 2 of 2:
JAX: High-Performance Array Computing
=====================================
- JAX is Autograd_ and XLA_, brought together for high-performance numerical computing.
+ JAX is a Python library for accelerator-oriented array computation and program transformation,
+ designed for high-performance numerical computing and large-scale machine learning.
If you're looking to train neural networks, use Flax_ and start with its documentation.
Some associated tools are Optax_ and Orbax_.
@@ -93,8 +94,6 @@ For an end-to-end transformer library built on JAX, see MaxText_.
glossary
- .. _Autograd: https://github.com/hips/autograd
- .. _XLA: https://openxla.org/xla
.. _Flax: https://flax.readthedocs.io/
.. _Orbax: https://orbax.readthedocs.io/
.. _Optax: https://optax.readthedocs.io/
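
As a point of reference for the new wording, here is a minimal, illustrative sketch of the program transformations it refers to (automatic differentiation and JIT compilation), assuming a standard `jax` installation; it is not part of this change:

```python
# Illustrative sketch: jax.grad and jax.jit are the kinds of program
# transformations the new description refers to.
import jax
import jax.numpy as jnp

def loss(w, x):
    # A simple quadratic in w; any pure Python/NumPy-style function works.
    return jnp.sum((w * x - 1.0) ** 2)

grad_loss = jax.grad(loss)      # differentiate with respect to the first argument
fast_grad = jax.jit(grad_loss)  # compile the gradient function with XLA

w = jnp.array([0.5, 2.0])
x = jnp.array([1.0, 3.0])
print(fast_grad(w, x))          # same code runs on CPU, GPU, or TPU
```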