Commit 0cbab9e

Documentation headings & sections (#2056)

* move docs around
* add a page for Zygote
* fixup
* move NNlib's activation functions to their own page
* restore callback helpers
* re-name advanced... page
* tweak Zygote page
* shape inference sounds better
* move functors down

1 parent 8bc0c35 · commit 0cbab9e

16 files changed (+259, -202 lines)


docs/Project.toml

Lines changed: 2 additions & 0 deletions

@@ -1,11 +1,13 @@
 [deps]
 BSON = "fbb218c0-5317-5bc6-957e-2ee96dd4b1f0"
+ChainRulesCore = "d360d2e6-b24c-11e9-a2a3-2a2ae2dbcce4"
 Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"
 Functors = "d9f16b24-f501-4c13-a1f2-28368ffc5196"
 MLUtils = "f1d291b0-491e-4a28-83b9-f70985020b54"
 NNlib = "872c559c-99b0-510c-b3b7-b6c96a88d5cd"
 OneHotArrays = "0b1bfda6-eb8a-41d2-88d8-f5af5cad476f"
 Optimisers = "3bd65402-5787-11e9-1adc-39752487f4e2"
+Zygote = "e88e6eb3-aa80-5325-afca-941959d7151f"

 [compat]
 Documenter = "0.27"

docs/make.jl

Lines changed: 19 additions & 15 deletions

@@ -1,41 +1,45 @@
-using Documenter, Flux, NNlib, Functors, MLUtils, BSON, Optimisers, OneHotArrays
+using Documenter, Flux, NNlib, Functors, MLUtils, BSON, Optimisers, OneHotArrays, Zygote, ChainRulesCore


 DocMeta.setdocmeta!(Flux, :DocTestSetup, :(using Flux); recursive = true)

 makedocs(
-    modules = [Flux, NNlib, Functors, MLUtils, BSON, Optimisers, OneHotArrays],
+    modules = [Flux, NNlib, Functors, MLUtils, BSON, Optimisers, OneHotArrays, Zygote, ChainRulesCore, Base],
     doctest = false,
     sitename = "Flux",
-    strict = [:cross_references,],
+    # strict = [:cross_references,],
     pages = [
         "Home" => "index.md",
         "Building Models" => [
             "Overview" => "models/overview.md",
             "Basics" => "models/basics.md",
             "Recurrence" => "models/recurrence.md",
-            "Model Reference" => "models/layers.md",
+            "Layer Reference" => "models/layers.md",
             "Loss Functions" => "models/losses.md",
             "Regularisation" => "models/regularisation.md",
-            "Advanced Model Building" => "models/advanced.md",
-            "Neural Network primitives from NNlib.jl" => "models/nnlib.md",
-            "Recursive transformations from Functors.jl" => "models/functors.md"
+            "Custom Layers" => "models/advanced.md",
+            "NNlib.jl" => "models/nnlib.md",
+            "Activation Functions" => "models/activation.md",
         ],
         "Handling Data" => [
-            "One-Hot Encoding with OneHotArrays.jl" => "data/onehot.md",
-            "Working with data using MLUtils.jl" => "data/mlutils.md"
+            "MLUtils.jl" => "data/mlutils.md",
+            "OneHotArrays.jl" => "data/onehot.md",
        ],
         "Training Models" => [
             "Optimisers" => "training/optimisers.md",
-            "Training" => "training/training.md"
+            "Training" => "training/training.md",
+            "Callback Helpers" => "training/callbacks.md",
+            "Zygote.jl" => "training/zygote.md",
        ],
         "GPU Support" => "gpu.md",
-        "Saving & Loading" => "saving.md",
-        "The Julia Ecosystem" => "ecosystem.md",
-        "Utility Functions" => "utilities.md",
+        "Model Tools" => [
+            "Saving & Loading" => "saving.md",
+            "Shape Inference" => "outputsize.md",
+            "Weight Initialisation" => "utilities.md",
+            "Functors.jl" => "models/functors.md",
+        ],
         "Performance Tips" => "performance.md",
-        "Datasets" => "datasets.md",
-        "Community" => "community.md"
+        "Flux's Ecosystem" => "ecosystem.md",
     ],
     format = Documenter.HTML(
         analytics = "UA-36890222-9",

docs/src/community.md

Lines changed: 0 additions & 5 deletions
This file was deleted.

docs/src/data/onehot.md

Lines changed: 2 additions & 0 deletions

@@ -51,6 +51,8 @@ julia> onecold(ans, [:a, :b, :c])

 Note that these operations returned `OneHotVector` and `OneHotMatrix` rather than `Array`s. `OneHotVector`s behave like normal vectors but avoid any unnecessary cost compared to using an integer index directly. For example, multiplying a matrix with a one-hot vector simply slices out the relevant row of the matrix under the hood.

+### Function listing
+
 ```@docs
 OneHotArrays.onehot
 OneHotArrays.onecold

docs/src/datasets.md

Lines changed: 0 additions & 6 deletions
This file was deleted.

docs/src/ecosystem.md

Lines changed: 5 additions & 1 deletion

@@ -1,4 +1,4 @@
-# The Julia Ecosystem
+# The Julia Ecosystem around Flux

 One of the main strengths of Julia lies in an ecosystem of packages
 globally providing a rich and consistent user experience.
@@ -49,7 +49,10 @@ Utility tools you're unlikely to have met if you never used Flux!

 ### Datasets

+Commonly used machine learning datasets are provided by the following packages in the julia ecosystem:
+
 - [MLDatasets.jl](https://github.com/JuliaML/MLDatasets.jl) focuses on downloading, unpacking, and accessing benchmark datasets.
+- [GraphMLDatasets.jl](https://github.com/yuehhua/GraphMLDatasets.jl): a library for machine learning datasets on graph.

 ### Plumbing

@@ -87,6 +90,7 @@ Packages based on differentiable programming but not necessarily related to Mach

 - [OnlineStats.jl](https://github.com/joshday/OnlineStats.jl) provides single-pass algorithms for statistics.

+
 ## Useful miscellaneous packages

 Some useful and random packages!

docs/src/index.md

Lines changed: 6 additions & 0 deletions
@@ -18,3 +18,9 @@ NOTE: Flux used to have a CuArrays.jl dependency until v0.10.4, replaced by CUDA
 ## Learning Flux

 There are several different ways to learn Flux. If you just want to get started writing models, the [model zoo](https://github.com/FluxML/model-zoo/) gives good starting points for many common ones. This documentation provides a reference to all of Flux's APIs, as well as a from-scratch introduction to Flux's take on models and how they work. Once you understand these docs, congratulations, you also understand [Flux's source code](https://github.com/FluxML/Flux.jl), which is intended to be concise, legible and a good reference for more advanced concepts.
+
+## Community
+
+All Flux users are welcome to join our community on the [Julia forum](https://discourse.julialang.org/), or the [slack](https://discourse.julialang.org/t/announcing-a-julia-slack/4866) (channel #machine-learning). If you have questions or issues we'll try to help you out.
+
+If you're interested in hacking on Flux, the [source code](https://github.com/FluxML/Flux.jl) is open and easy to understand -- it's all just the same Julia code you work with normally. You might be interested in our [intro issues](https://github.com/FluxML/Flux.jl/labels/good%20first%20issue) to get started or our [contributing guide](https://github.com/FluxML/Flux.jl/blob/master/CONTRIBUTING.md).

docs/src/models/activation.md

Lines changed: 39 additions & 0 deletions
@@ -0,0 +1,39 @@
+
+# Activation Functions from NNlib.jl
+
+These non-linearities used between layers of your model are exported by the [NNlib](https://github.com/FluxML/NNlib.jl) package.
+
+Note that, unless otherwise stated, activation functions operate on scalars. To apply them to an array you can call `σ.(xs)`, `relu.(xs)` and so on. Alternatively, they can be passed to a layer like `Dense(784 => 1024, relu)` which will handle this broadcasting.
+
+```@docs
+celu
+elu
+gelu
+hardsigmoid
+sigmoid_fast
+hardtanh
+tanh_fast
+leakyrelu
+lisht
+logcosh
+logsigmoid
+mish
+relu
+relu6
+rrelu
+selu
+sigmoid
+softplus
+softshrink
+softsign
+swish
+hardswish
+tanhshrink
+trelu
+```
+
+Julia's `Base.Math` also provide `tanh`, which can be used as an activation function:
+
+```@docs
+tanh
+```
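The broadcasting note in the new activation page can be checked in plain Julia. The sketch below is illustrative only: the scalar `relu` defined here is a stand-in written for this example, not NNlib's exported implementation (in practice `using NNlib`, or Flux which re-exports it, provides the real function).

```julia
# Stand-in scalar activation for illustration; NNlib exports the real `relu`.
relu(x) = max(zero(x), x)

xs = [-1.5, 0.0, 2.5]

# Activation functions operate on scalars, so apply them to an array
# elementwise with broadcast syntax, as the docs page describes:
ys = relu.(xs)  # one scalar application per element
```

A layer such as `Dense(784 => 1024, relu)` performs the same elementwise application internally over its output.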

docs/src/models/advanced.md

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-# Advanced Model Building and Customisation
+# Defining Customised Layers

 Here we will try and describe usage of some more advanced features that Flux provides to give more control over model building.


docs/src/models/layers.md

Lines changed: 9 additions & 0 deletions
@@ -86,3 +86,12 @@ Many normalisation layers behave differently under training and inference (testi
 Flux.testmode!
 trainmode!
 ```
+
+
+## Listing All Layers
+
+The `modules` command uses Functors to extract a flat list of all layers:
+
+```@docs
+Flux.modules
+```
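The "Listing All Layers" section added above says that `Flux.modules` uses Functors to flatten a nested model into a list. The toy sketch below imitates that flattening with hand-rolled types so it runs without Flux; `ToyChain` and `flatten_modules` are names invented here, and this is not Flux's actual implementation (which walks the structure via Functors.jl).

```julia
# Toy stand-in for a container layer; Flux's real containers use Functors.jl.
struct ToyChain
    layers::Tuple
end

# A leaf flattens to itself; a container flattens to itself followed by
# its recursively-flattened children, mimicking a `modules`-style traversal.
flatten_modules(x) = Any[x]
flatten_modules(c::ToyChain) =
    vcat(Any[c], [m for l in c.layers for m in flatten_modules(l)])

model = ToyChain((:dense1, ToyChain((:dense2, :relu))))
mods = flatten_modules(model)  # outer chain, :dense1, inner chain, :dense2, :relu
```

The key design point, as in the real `Flux.modules`, is that containers appear in the list alongside their children, so regularisation code can pick out whichever layer types it cares about.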
