Commit 67ac0b1

move NNlib's activation functions to their own page
1 parent d31694d commit 67ac0b1

3 files changed: 41 additions & 32 deletions

docs/make.jl

Lines changed: 1 addition & 0 deletions
@@ -19,6 +19,7 @@ makedocs(
         "Regularisation" => "models/regularisation.md",
         "Advanced Model Building" => "models/advanced.md",
         "NNlib.jl" => "models/nnlib.md",
+        "Activation Functions" => "models/activation.md",
         "Functors.jl" => "models/functors.md",
     ],
     "Handling Data" => [

docs/src/models/activation.md

Lines changed: 39 additions & 0 deletions
@@ -0,0 +1,39 @@
+
+# Activation Functions from NNlib.jl
+
+These non-linearities, used between the layers of your model, are exported by the [NNlib](https://github.com/FluxML/NNlib.jl) package.
+
+Note that, unless otherwise stated, activation functions operate on scalars. To apply them to an array you can call `σ.(xs)`, `relu.(xs)` and so on. Alternatively, they can be passed to a layer like `Dense(784 => 1024, relu)`, which will handle this broadcasting for you.
+
+```@docs
+celu
+elu
+gelu
+hardsigmoid
+sigmoid_fast
+hardtanh
+tanh_fast
+leakyrelu
+lisht
+logcosh
+logsigmoid
+mish
+relu
+relu6
+rrelu
+selu
+sigmoid
+softplus
+softshrink
+softsign
+swish
+hardswish
+tanhshrink
+trelu
+```
+
+Julia's `Base.Math` also provides `tanh`, which can be used as an activation function:
+
+```@docs
+tanh
+```
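
For reference, here is a minimal sketch of the two usage patterns described in the new page above (broadcasting an activation over an array, or passing it to a layer such as `Dense(784 => 1024, relu)` that applies it for you); the input array is an arbitrary example:

```julia
using Flux   # Flux re-exports NNlib's activation functions, e.g. relu and σ

xs = randn(Float32, 784)          # arbitrary example input

ys = relu.(xs)                    # broadcast the scalar activation over the array

layer = Dense(784 => 1024, relu)  # or let the layer apply (and broadcast) relu itself
zs = layer(xs)
```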

docs/src/models/nnlib.md

Lines changed: 1 addition & 32 deletions
@@ -1,37 +1,6 @@
 # Neural Network primitives from NNlib.jl
 
-Flux re-exports all of the functions exported by the [NNlib](https://github.com/FluxML/NNlib.jl) package.
-
-## Activation Functions
-
-Non-linearities that go between layers of your model. Note that, unless otherwise stated, activation functions operate on scalars. To apply them to an array you can call `σ.(xs)`, `relu.(xs)` and so on.
-
-```@docs
-celu
-elu
-gelu
-hardsigmoid
-sigmoid_fast
-hardtanh
-tanh_fast
-leakyrelu
-lisht
-logcosh
-logsigmoid
-mish
-relu
-relu6
-rrelu
-selu
-sigmoid
-softplus
-softshrink
-softsign
-swish
-hardswish
-tanhshrink
-trelu
-```
+Flux re-exports all of the functions exported by the [NNlib](https://github.com/FluxML/NNlib.jl) package. This includes the activation functions, which are described on the next page. Many of the functions on this page exist primarily as the internal implementation of Flux layers, but they can also be used independently.
 
 ## Softmax
 
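
To illustrate the note in the new introduction that these functions "can also be used independently", here is a small sketch calling `softmax` directly via Flux's re-export; the input matrix is an arbitrary example:

```julia
using Flux   # re-exports NNlib functions such as softmax

scores = randn(Float32, 5, 3)       # 5 classes by 3 samples, arbitrary example data
probs  = softmax(scores; dims = 1)  # normalise each column into a probability vector

sum(probs; dims = 1)                # each column sums to (approximately) 1
```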