Better docs for reexported packages #2046

Merged · 6 commits · Aug 29, 2022
8 changes: 4 additions & 4 deletions docs/make.jl
@@ -18,12 +18,12 @@ makedocs(
"Loss Functions" => "models/losses.md",
"Regularisation" => "models/regularisation.md",
"Advanced Model Building" => "models/advanced.md",
-"NNlib" => "models/nnlib.md",
-"Functors" => "models/functors.md"
+"Neural Network primitives from NNlib.jl" => "models/nnlib.md",
+"Functor from Functors.jl" => "models/functors.md"
Member:

One more thought. Should Zygote.jl be among the packages that get a sidebar heading?

],
"Handling Data" => [
-"One-Hot Encoding" => "data/onehot.md",
-"MLUtils" => "data/mlutils.md"
+"One-Hot Encoding with OneHotArrays.jl" => "data/onehot.md",
+"Working with data using MLUtils.jl" => "data/mlutils.md"
],
"Training Models" => [
"Optimisers" => "training/optimisers.md",
2 changes: 1 addition & 1 deletion docs/src/data/mlutils.md
@@ -1,4 +1,4 @@
-# MLUtils.jl
+# Working with data using MLUtils.jl
Member:

I can't comment below, but flatten appears on this page (as it should), and also here:

https://fluxml.ai/Flux.jl/latest/models/layers/#Flux.flatten

Should it be removed?

Member Author:

Ah, yes! Thanks!

Member Author:

Correction: MLUtils.unsqueeze cross-references MLUtils.flatten, and the doctests fail if I remove MLUtils.flatten's reference from the docs.


Flux re-exports the `DataLoader` type and utility functions for working with
data from [MLUtils](https://github.com/JuliaML/MLUtils.jl).
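
For instance, a minimal sketch of iterating mini-batches with the re-exported `DataLoader` (the array sizes here are made up for illustration):

```julia
using Flux  # re-exports DataLoader from MLUtils

X = rand(Float32, 10, 100)  # 10 features × 100 observations
Y = rand(Float32, 1, 100)   # matching targets

# Shuffled mini-batches of at most 16 observations per iteration
for (x, y) in DataLoader((X, Y); batchsize=16, shuffle=true)
    @assert size(x, 1) == 10  # features stay in the first dimension
end
```
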
2 changes: 1 addition & 1 deletion docs/src/models/functors.md
@@ -1,4 +1,4 @@
-# Functors.jl
+# Functor from Functors.jl

Flux makes use of the [Functors.jl](https://github.com/FluxML/Functors.jl) to represent many of the core functionalities it provides.

@mcabbott (Member), Aug 22, 2022:

This seems a bit vague to me... first, this use of "functor" belongs to a tiny niche of functional programming; I don't think it's worth putting any emphasis on that here, it's just some package with a weird name. (I mean the title, and the paragraph just below where I can comment.)

Second, I don't know what "many of the core functionalities" means. I wrote something but maybe giving examples (params, training, gpu) would be nice too.

Suggested change
-# Functor from Functors.jl
-Flux makes use of the [Functors.jl](https://github.com/FluxML/Functors.jl) to represent many of the core functionalities it provides.
+# Recursive transformations from Functors.jl
+Flux models are deeply nested structures, and [Functors.jl](https://github.com/FluxML/Functors.jl) provides tools needed to explore such objects, apply functions to the parameters they contain, and re-build them.

I can't make suggestions below this, but (IMO) it might also be worth separating the list of functions according to level of obscurity. Maybe @functor, functor, isleaf, fmap should be in one block, the rest in another? Or just @functor, fmap at the top? Not sure, maybe it's too messy to make such a division.

Member Author:

The number of docstrings looks too few for a further division; should I still divide them?

@ToucheSir (Member), Aug 24, 2022:

I would think of and describe functors like you would the module system of any other library. Because while it is a general-purpose library, that's what we use it for in Flux. One way to do this could be to show practical examples of using @functor etc. to define layers, and then explain what is happening along with why things are done this way. If that last part is too much, just the examples, module system mention and a link to the advanced model building page could suffice.
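
For instance, a minimal sketch of the kind of example that comment describes (the `Affine` layer here is hypothetical, not Flux's actual `Dense`):

```julia
using Functors

struct Affine{W, B}
    weight::W
    bias::B
end
Affine(in::Integer, out::Integer) = Affine(randn(Float32, out, in), zeros(Float32, out))
(a::Affine)(x) = a.weight * x .+ a.bias

@functor Affine  # mark the fields as children for traversal

m = Affine(3, 2)
# fmap walks the struct, transforms its arrays, and rebuilds it;
# Flux's params, gpu, etc. are built on this same mechanism
m64 = fmap(x -> x isa AbstractArray ? Float64.(x) : x, m)
```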

125 changes: 85 additions & 40 deletions docs/src/models/nnlib.md
@@ -1,4 +1,4 @@
-# NNlib.jl
+# Neural Network primitives from NNlib.jl

Flux re-exports all of the functions exported by the [NNlib](https://github.com/FluxML/NNlib.jl) package.

@@ -7,82 +7,127 @@ Flux re-exports all of the functions exported by the [NNlib](https://github.com/
Non-linearities that go between layers of your model. Note that, unless otherwise stated, activation functions operate on scalars. To apply them to an array you can call `σ.(xs)`, `relu.(xs)` and so on.

```@docs
-NNlib.celu
-NNlib.elu
-NNlib.gelu
-NNlib.hardsigmoid
-NNlib.sigmoid_fast
-NNlib.hardtanh
-NNlib.tanh_fast
-NNlib.leakyrelu
-NNlib.lisht
-NNlib.logcosh
-NNlib.logsigmoid
-NNlib.mish
-NNlib.relu
-NNlib.relu6
-NNlib.rrelu
-NNlib.selu
-NNlib.sigmoid
-NNlib.softplus
-NNlib.softshrink
-NNlib.softsign
-NNlib.swish
-NNlib.tanhshrink
-NNlib.trelu
+celu
+elu
+gelu
+hardsigmoid
+sigmoid_fast
+hardtanh
+tanh_fast
+leakyrelu
+lisht
+logcosh
+logsigmoid
+mish
+relu
+relu6
+rrelu
+selu
+sigmoid
+softplus
+softshrink
+softsign
+swish
+hardswish
+tanhshrink
+trelu
```
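
For example, the broadcasting convention described above, sketched on a small vector (values illustrative):

```julia
using Flux

xs = Float32[-2, -1, 0, 1, 2]
relu.(xs)  # Float32[0, 0, 0, 1, 2], applied elementwise by broadcasting
σ.(xs)     # elementwise logistic sigmoid, likewise broadcast
```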

## Softmax

`Flux`'s `logitcrossentropy` uses `NNlib.softmax` internally.

```@docs
-NNlib.softmax
-NNlib.logsoftmax
+softmax
+logsoftmax
```
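
A minimal sketch of `softmax` normalising columns (its default `dims=1`):

```julia
using Flux

x = Float32[1 2; 3 4]
p = softmax(x)
sum(p, dims=1)           # each column sums to 1
logsoftmax(x) ≈ log.(p)  # same values, computed in a numerically safer way
```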

## Pooling

`Flux`'s `AdaptiveMaxPool`, `AdaptiveMeanPool`, `GlobalMaxPool`, `GlobalMeanPool`, `MaxPool`, and `MeanPool` use `NNlib.PoolDims`, `NNlib.maxpool`, and `NNlib.meanpool` as their backend.

```@docs
-NNlib.maxpool
-NNlib.meanpool
+PoolDims
+maxpool
+meanpool
```
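
For instance, a minimal sketch of calling the pooling functions directly on a WHCN array (shapes illustrative):

```julia
using Flux

x = rand(Float32, 8, 8, 3, 1)  # width × height × channels × batch
size(maxpool(x, (2, 2)))       # (4, 4, 3, 1)
size(meanpool(x, (2, 2)))      # (4, 4, 3, 1)
```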

## Padding

```@docs
pad_reflect
pad_constant
pad_repeat
pad_zeros
```
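
A small sketch of the padding helpers on a 2-dimensional array, assuming the tuple form gives (lo, hi) amounts per dimension as in the docstrings above:

```julia
using Flux

x = reshape(Float32.(1:9), 3, 3)
size(pad_zeros(x, (1, 1, 1, 1)))   # (5, 5): one extra row/column of zeros per side
size(pad_repeat(x, (1, 1, 1, 1)))  # (5, 5): border filled by repeating edge values
```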

## Convolution

`Flux`'s `Conv` and `CrossCor` layers use `NNlib.DenseConvDims` and `NNlib.conv` internally.

```@docs
-NNlib.conv
-NNlib.depthwiseconv
+conv
+ConvDims
+depthwiseconv
+DepthwiseConvDims
+DenseConvDims
```
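
For instance, a minimal sketch of calling `conv` directly, outside any layer (shapes illustrative):

```julia
using Flux

x = rand(Float32, 28, 28, 1, 4)  # WHCN input: 28×28, 1 channel, batch of 4
w = rand(Float32, 3, 3, 1, 8)    # 3×3 kernel, 1 input channel, 8 output channels
size(conv(x, w))         # (26, 26, 8, 4), no padding by default
size(conv(x, w; pad=1))  # (28, 28, 8, 4), "same" spatial size
```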

## Upsampling

`Flux`'s `Upsample` layer uses `NNlib.upsample_nearest`, `NNlib.upsample_bilinear`, and `NNlib.upsample_trilinear` as its backend. Additionally, `Flux`'s `PixelShuffle` layer uses `NNlib.pixel_shuffle` as its backend.

```@docs
-NNlib.upsample_nearest
-NNlib.upsample_bilinear
-NNlib.upsample_trilinear
-NNlib.pixel_shuffle
-NNlib.grid_sample
+upsample_nearest
+∇upsample_nearest
+upsample_linear
+∇upsample_linear
+upsample_bilinear
+∇upsample_bilinear
+upsample_trilinear
+∇upsample_trilinear
+pixel_shuffle
```
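
A small sketch of the upsampling functions (shapes illustrative):

```julia
using Flux

x = rand(Float32, 4, 4, 3, 1)
size(upsample_nearest(x, (2, 2)))   # (8, 8, 3, 1), values repeated
size(upsample_bilinear(x, (2, 2)))  # (8, 8, 3, 1), values interpolated

# pixel_shuffle trades channels for resolution: C*r^2 channels, r-times larger output
size(pixel_shuffle(rand(Float32, 4, 4, 4, 1), 2))  # (8, 8, 1, 1)
```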

## Batched Operations

`Flux`'s `Bilinear` layer uses `NNlib.batched_mul` internally.

```@docs
-NNlib.batched_mul
-NNlib.batched_mul!
-NNlib.batched_adjoint
-NNlib.batched_transpose
+batched_mul
+batched_mul!
+batched_adjoint
+batched_transpose
+batched_vec
```
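
For instance, a minimal sketch of `batched_mul` treating the third dimension as a batch:

```julia
using Flux

A = rand(Float32, 2, 3, 10)  # ten 2×3 matrices
B = rand(Float32, 3, 4, 10)  # ten 3×4 matrices
C = batched_mul(A, B)        # ten 2×4 products, size (2, 4, 10)
C == A ⊠ B                   # ⊠ is the infix form of batched_mul
```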

## Gather and Scatter

`Flux`'s `Embedding` layer uses `NNlib.gather` as its backend.

```@docs
NNlib.gather
NNlib.gather!
NNlib.scatter
NNlib.scatter!
```
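
For example, a minimal sketch of `gather` picking columns, the pattern behind `Embedding` (written with the `NNlib.` prefix since, as the list above suggests, these functions are not re-exported):

```julia
using Flux, NNlib

W = Float32[1 2 3; 4 5 6]   # a tiny "embedding table" with 3 columns
NNlib.gather(W, [1, 3, 3])  # 2×3 matrix holding columns 1, 3 and 3 of W
```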

## Sampling

```@docs
grid_sample
∇grid_sample
```

## Losses

```@docs
ctc_loss
```

## Miscellaneous

```@docs
-NNlib.logsumexp
+logsumexp
NNlib.glu
```
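
A small sketch of `logsumexp`'s numerical advantage (`glu` is written with the `NNlib.` prefix on the assumption that it stays unexported, as in the list above):

```julia
using Flux, NNlib

x = Float32[1000, 1001, 1002]
log(sum(exp.(x)))  # Inf: exp overflows Float32
logsumexp(x)       # ≈ 1002.41f0, computed stably

NNlib.glu(Float32[1, 2, 3, 4], 1)  # gated linear unit: first half .* σ.(second half)
```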