Commit 94be20f

darsnack and ToucheSir authored
More informative explanation from review
Co-authored-by: Brian Chen <ToucheSir@users.noreply.github.com>
1 parent ad83666 · commit 94be20f

1 file changed: +3 -3 lines changed

src/layers/normalise.jl

Lines changed: 3 additions & 3 deletions
@@ -51,8 +51,8 @@ end

 Dropout layer. In the forward pass, apply the [`Flux.dropout`](@ref) function on the input.

-For N-D dropout layers (e.g. `Dropout2d` or `Dropout3d` in PyTorch),
-specify the `dims` keyword (i.e. `Dropout(p; dims = 3)` is a 2D dropout layer).
+To apply dropout along a certain dimension (e.g. zeroing out an entire channel's feature map),
+specify the `dims` keyword (i.e. `Dropout(p; dims = 3)` is a 2D dropout layer on WHCN input).

 Does nothing to the input once [`Flux.testmode!`](@ref) is `true`.
 """
@@ -423,4 +423,4 @@ function Base.show(io::IO, l::GroupNorm)
 print(io, "GroupNorm($(join(size(l.β), ", "))")
 (l.λ == identity) || print(io, ", λ = $(l.λ)")
 print(io, ")")
-end
+end
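
For readers unfamiliar with the `dims` keyword the new wording documents, below is a minimal sketch of the behavior it describes. `Dropout(p; dims)` and `Flux.trainmode!` are standard Flux API; the array size, probability, and variable names are illustrative choices, not part of this commit.

using Flux

# With dims = 3 on a WHCN (width, height, channel, batch) array, one
# Bernoulli mask entry is drawn per channel, so each channel's feature
# map is either zeroed entirely or kept and rescaled by 1 / (1 - p).
d = Dropout(0.5; dims = 3)
Flux.trainmode!(d)               # force the stochastic (training) behavior

x = ones(Float32, 4, 4, 8, 1)    # W = 4, H = 4, C = 8, N = 1
y = d(x)

# Every channel slice is either all-zero or a uniformly rescaled copy.
[all(iszero, y[:, :, c, 1]) for c in 1:8]

On 4-D WHCN input this matches what the replaced wording called `Dropout2d` in PyTorch: whole channels are dropped rather than individual activations.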
