Commit ef69936

Remove redundant randomness, add docfilters, and make them stricter for Dropout layer
1 parent 69e996a commit ef69936

File tree

3 files changed: +22 −55 lines


src/layers/normalise.jl

Lines changed: 6 additions & 10 deletions
@@ -67,7 +67,7 @@ Custom RNGs are only supported on the CPU.
 Does nothing to the input once [`Flux.testmode!`](@ref) is `true`.

 # Examples
-```jldoctest
+```jldoctest; filter = r"[+-]?(?:(?:[0-9])(?:\\.\\d+)?)|(?:1)(?:\\.0+)?"
 julia> m = Chain(Dense(2 => 2), Dropout(1))
 Chain(
   Dense(2 => 2), # 6 parameters
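The double backslashes above are Julia docstring escapes; the regex Documenter actually compiles is `[+-]?(?:(?:[0-9])(?:\.\d+)?)|(?:1)(?:\.0+)?`. A doctest filter deletes every match from both the expected and the actual output before comparing, so only the nondeterministic numbers are scrubbed. A rough sanity check of the pattern, sketched in Python (the `re` syntax is close enough to Julia's PCRE for this pattern; the sample line is illustrative, not taken from the doctest):

```python
import re

# The compiled filter regex, after unescaping the Julia docstring form.
pat = re.compile(r"[+-]?(?:(?:[0-9])(?:\.\d+)?)|(?:1)(?:\.0+)?")

# Documenter-style filtering: delete every match before comparing outputs.
line = "-0.733883  1.533709"
print(pat.sub("", line))  # the numbers vanish, leaving only the spacing
```

Because both outputs are filtered the same way, two different random printouts reduce to the same string and the doctest passes.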
@@ -136,14 +136,10 @@ remain the same as before.
 Does nothing to the input once [`testmode!`](@ref) is true.

 # Examples
-```jldoctest
+```jldoctest; filter = r"[+-]?([0-9]*[.])?[0-9]+"
 julia> x = randn(20,1);

-julia> m = Chain(Dense(20 => 10, selu), AlphaDropout(0.5))
-Chain(
-  Dense(20 => 10, selu), # 210 parameters
-  AlphaDropout{Float64, Random.TaskLocalRNG}(0.5, nothing, Random.TaskLocalRNG()),
-)
+julia> m = Chain(Dense(20 => 10, selu), AlphaDropout(0.5));

 julia> Flux.trainmode!(m);
@@ -212,7 +208,7 @@ using the [`Scale`](@ref) layer.
 See also [`BatchNorm`](@ref), [`InstanceNorm`](@ref), [`GroupNorm`](@ref), and [`normalise`](@ref).

 # Examples
-```jldoctest
+```jldoctest; filter = r"[+-]?([0-9]*[.])?[0-9]+"
 julia> xs = rand(3, 3, 3, 2); # a batch of 2 3X3X3 images

 julia> m = LayerNorm(3);
@@ -423,7 +419,7 @@ that will be used to renormalize the input in test phase.
 in previous Flux versions (< v0.12).

 # Examples
-```jldoctest
+```jldoctest; filter = r"[+-]?([0-9]*[.])?[0-9]+"
 julia> xs = rand(3, 3, 3, 2); # a batch of 2 3X3X3 images

 julia> m = InstanceNorm(3);
@@ -521,7 +517,7 @@ If `track_stats=true`, accumulates mean and var statistics in training phase
 that will be used to renormalize the input in test phase.

 # Examples
-```jldoctest
+```jldoctest; filter = r"[+-]?([0-9]*[.])?[0-9]+"
 julia> xs = rand(3, 3, 4, 2); # a batch of 2 3X3X4 images

 julia> m = GroupNorm(4, 2);
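The `[+-]?([0-9]*[.])?[0-9]+` filter applied throughout these examples erases every signed integer or decimal before Documenter compares expected and actual doctest output, which is what lets examples built on `rand`/`randn` pass without a fixed RNG. A small illustration, sketched in Python for convenience (the pattern behaves the same under Julia's PCRE; the sample strings are illustrative):

```python
import re

# The shared doctest filter: any optionally signed integer or decimal number.
FLOAT = re.compile(r"[+-]?([0-9]*[.])?[0-9]+")

# Two different random printouts become identical once numbers are scrubbed,
# so the doctest comparison succeeds regardless of the RNG draw.
run1 = "0.826452  0.0519244"
run2 = "0.343179  0.445101"
assert FLOAT.sub("", run1) == FLOAT.sub("", run2)
```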

src/layers/recurrent.jl

Lines changed: 11 additions & 19 deletions
@@ -180,31 +180,23 @@ Assuming you have a `Recur` layer `rnn`, this is roughly equivalent to:
 rnn.state = hidden(rnn.cell)

 # Examples
-```jldoctest
-julia> r = RNN(3 => 5);
+```jldoctest; filter = r"[+-]?([0-9]*[.])?[0-9]+"
+julia> r = RNN(1 => 1);
+
+julia> a = Vector{Float32}([1])
+1-element Vector{Float32}:
+ 1.0

 julia> r.state
-5×1 Matrix{Float32}:
- 0.0
- 0.0
- 0.0
- 0.0
+1×1 Matrix{Float32}:
  0.0

-julia> r(rand(Float32, 3)); r.state
-5×1 Matrix{Float32}:
- -0.32719195
- -0.45280662
- -0.50386846
- -0.14782222
-  0.23584609
+julia> r(a); r.state
+1×1 Matrix{Float32}:
+ 0.61431444

 julia> Flux.reset!(r)
-5×1 Matrix{Float32}:
- 0.0
- 0.0
- 0.0
- 0.0
+1×1 Matrix{Float32}:
  0.0
 ```
 """

src/layers/upsample.jl

Lines changed: 5 additions & 26 deletions
@@ -81,34 +81,13 @@ resolution images while upscaling them.
 See [`NNlib.pixel_shuffle`](@ref).

 # Examples
-```jldoctest; filter = r"[+-]?([0-9]*[.])?[0-9]+"
+```jldoctest
 julia> p = PixelShuffle(2);

-julia> xs = rand(2, 2, 4, 1) # an image with 4 channels having 2X2 pixels in each channel
-2×2×4×1 Array{Float64, 4}:
-[:, :, 1, 1] =
- 0.826452   0.0519244
- 0.0686387  0.438346
-
-[:, :, 2, 1] =
- 0.343179  0.445101
- 0.543927  0.740905
-
-[:, :, 3, 1] =
- 0.105997  0.422996
- 0.32957   0.167205
-
-[:, :, 4, 1] =
- 0.825737  0.98609
- 0.757365  0.294784
-
-julia> p(xs) # upsampled image with only 1 channel
-4×4×1×1 Array{Float64, 4}:
-[:, :, 1, 1] =
- 0.826452   0.105997  0.0519244  0.422996
- 0.343179   0.825737  0.445101   0.98609
- 0.0686387  0.32957   0.438346   0.167205
- 0.543927   0.757365  0.740905   0.294784
+julia> xs = rand(2, 2, 4, 1); # an image with 4 channels having 2X2 pixels in each channel
+
+julia> p(xs) |> size # upsampled image with only 1 channel
+(4, 4, 1, 1)
 ```
 """
 struct PixelShuffle
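The trimmed example keeps only the shape check, which is the deterministic part: `PixelShuffle(r)` moves a factor of `r²` out of the channel dimension and into the two spatial dimensions, so no filter is needed at all. A quick shape-arithmetic sketch (`pixel_shuffle_shape` is a hypothetical helper, mirroring NNlib's width × height × channels × batch layout):

```python
def pixel_shuffle_shape(shape, r):
    """Output shape of a pixel shuffle with upscale factor r on a WHCN array."""
    w, h, c, n = shape
    assert c % (r * r) == 0, "channels must be divisible by r^2"
    return (w * r, h * r, c // (r * r), n)

# The doctest's case: 4 channels of 2x2 pixels, r = 2 -> one 4x4 channel.
print(pixel_shuffle_shape((2, 2, 4, 1), 2))  # (4, 4, 1, 1)
```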
