
Commit f922c16

Fix docstrings
1 parent 15ac6cd commit f922c16

File tree: 1 file changed (+2 −2 lines)

src/layers/normalise.jl

Lines changed: 2 additions & 2 deletions
@@ -55,7 +55,7 @@ function _dropout_mask(rng, x, p; dims=:)
 end
 
 """
-    Dropout(p; dims=:, rng = default_rng())
+    Dropout(p; dims=:, rng = rng_from_array())
 
 Dropout layer. In the forward pass, apply the [`Flux.dropout`](@ref) function on the input.
 
@@ -100,7 +100,7 @@ function Base.show(io::IO, d::Dropout)
 end
 
 """
-    AlphaDropout(p; rng = default_rng())
+    AlphaDropout(p; rng = rng_from_array())
 
 A dropout layer. Used in
 [Self-Normalizing Neural Networks](https://arxiv.org/abs/1706.02515).
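
For reference, here is a minimal usage sketch (not part of this commit) of the two layers whose docstrings are touched above, written against the constructor signatures shown in the diff. The explicit MersenneTwister rng and the input sizes are illustrative assumptions, and trainmode! is used to force dropout behaviour outside of a training loop.

using Flux, Random

# Dropout: during training, zero each element of the input with probability p
# (along `dims`), rescaling the surviving entries by 1/(1 - p).
d = Dropout(0.5; rng = Random.MersenneTwister(0))  # explicit rng instead of the default rng_from_array()
Flux.trainmode!(d)        # force the training-mode path outside of a gradient computation
x = randn(Float32, 4, 3)
y = d(x)                  # roughly half of the entries are zeroed

# AlphaDropout: dropout variant that preserves the self-normalising statistics
# of selu activations (Klambauer et al., 2017), linked in the docstring above.
ad = AlphaDropout(0.2)
Flux.trainmode!(ad)
z = ad(randn(Float32, 4, 3))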
