
Commit d13e52a

fix doctests
1 parent 0d6619a commit d13e52a

File tree: 2 files changed, +8 −8 lines changed


src/Optimisers.jl

Lines changed: 5 additions & 5 deletions
@@ -77,7 +77,7 @@ or [`update!`](@ref).
 julia> m = (x = rand(3), y = (true, false), z = tanh);

 julia> Optimisers.setup(Momentum(), m)  # same field names as m
-(x = Leaf(Momentum{Float32}(0.01, 0.9), [0.0, 0.0, 0.0]), y = (nothing, nothing), z = nothing)
+(x = Leaf(Momentum{Float32}(0.01, 0.9), [0.0, 0.0, 0.0]), y = ((), ()), z = ())
 ```

 The recursion into structures uses Functors.jl, and any new `struct`s containing parameters
@@ -90,15 +90,15 @@ julia> struct Layer; mat; fun; end
 julia> model = (lay = Layer([1 2; 3 4f0], sin), vec = [5, 6f0]);

 julia> Optimisers.setup(Momentum(), model)  # new struct is by default ignored
-(lay = nothing, vec = Leaf(Momentum{Float32}(0.01, 0.9), Float32[0.0, 0.0]))
+(lay = (), vec = Leaf(Momentum{Float32}(0.01, 0.9), Float32[0.0, 0.0]))

 julia> destructure(model)
 (Float32[5.0, 6.0], Restructure(NamedTuple, ..., 2))

 julia> using Functors; @functor Layer  # annotate this type as containing parameters

 julia> Optimisers.setup(Momentum(), model)
-(lay = (mat = Leaf(Momentum{Float32}(0.01, 0.9), Float32[0.0 0.0; 0.0 0.0]), fun = nothing), vec = Leaf(Momentum{Float32}(0.01, 0.9), Float32[0.0, 0.0]))
+(lay = (mat = Leaf(Momentum{Float32}(0.01, 0.9), Float32[0.0 0.0; 0.0 0.0]), fun = ()), vec = Leaf(Momentum{Float32}(0.01, 0.9), Float32[0.0, 0.0]))

 julia> destructure(model)
 (Float32[1.0, 3.0, 2.0, 4.0, 5.0, 6.0], Restructure(NamedTuple, ..., 6))
@@ -120,12 +120,12 @@ See also [`update!`](@ref), which will be faster for models of ordinary `Array`s
 julia> m = (x = Float32[1,2,3], y = tanh);

 julia> t = Optimisers.setup(Descent(0.1f0), m)
-(x = Leaf(Descent{Float32}(0.1), nothing), y = nothing)
+(x = Leaf(Descent{Float32}(0.1), nothing), y = ())

 julia> g = (x = [1,1,1], y = nothing);  # fake gradient

 julia> Optimisers.update(t, m, g)
-((x = Leaf(Descent{Float32}(0.1), nothing), y = nothing), (x = Float32[0.9, 1.9, 2.9], y = tanh))
+((x = Leaf(Descent{Float32}(0.1), nothing), y = ()), (x = Float32[0.9, 1.9, 2.9], y = tanh))
 ```
 """
 update
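The `update` doctests above walk through the whole workflow: `setup` on a model, opting a custom struct in with `@functor`, then applying a gradient. A runnable sketch of that sequence, assuming Optimisers.jl and Functors.jl are installed (the gradient values here are made up):

```julia
using Optimisers, Functors

struct Layer; mat; fun; end
@functor Layer  # opt Layer in to recursion, so `mat` is seen as a parameter

model = (lay = Layer([1 2; 3 4f0], sin), vec = [5, 6f0])

# Non-trainable leaves (such as the function `sin`) appear as `()` in the state tree:
st = Optimisers.setup(Momentum(), model)

# One optimisation step with a fake gradient of the same shape as the model:
grad = (lay = (mat = [0 1; 1 0f0], fun = nothing), vec = [1, 1f0])
st, model = Optimisers.update(st, model, grad)
```

Note that `update` is non-mutating: it returns a new state tree and a new model, which is why both are re-bound.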

src/adjust.jl

Lines changed: 3 additions & 3 deletions
@@ -13,15 +13,15 @@ To change just the learning rate, provide a number `η::Real`.
 julia> m = (vec = rand(Float32, 2), fun = sin);

 julia> st = Optimisers.setup(Nesterov(), m)  # stored momentum is initialised to zero
-(vec = Leaf(Nesterov{Float32}(0.001, 0.9), Float32[0.0, 0.0]), fun = nothing)
+(vec = Leaf(Nesterov{Float32}(0.001, 0.9), Float32[0.0, 0.0]), fun = ())

 julia> st, m = Optimisers.update(st, m, (vec = [16, 88], fun = nothing));  # with fake gradient

 julia> st
-(vec = Leaf(Nesterov{Float32}(0.001, 0.9), Float32[-0.016, -0.088]), fun = nothing)
+(vec = Leaf(Nesterov{Float32}(0.001, 0.9), Float32[-0.016, -0.088]), fun = ())

 julia> st = Optimisers.adjust(st, 0.123)  # change learning rate, stored momentum untouched
-(vec = Leaf(Nesterov{Float32}(0.123, 0.9), Float32[-0.016, -0.088]), fun = nothing)
+(vec = Leaf(Nesterov{Float32}(0.123, 0.9), Float32[-0.016, -0.088]), fun = ())
 ```

 To change other parameters, `adjust` also accepts keyword arguments matching the field
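The docstring text above continues past this view; the keyword form it describes can be sketched as follows, assuming the keyword names match the rule's hyperparameter fields (e.g. `rho` for Nesterov's momentum coefficient):

```julia
using Optimisers

m  = (vec = rand(Float32, 2), fun = sin)
st = Optimisers.setup(Nesterov(), m)

st = Optimisers.adjust(st, 0.123)       # positional number: sets the learning rate
st = Optimisers.adjust(st, rho = 0.95)  # keyword: sets momentum, learning rate untouched
```

Both forms are non-mutating and leave the stored momentum buffers in place, as the doctest above shows for the positional call.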
