
Commit 1279ba4

Add review comments
1 parent be41a43

File tree

1 file changed: +9 -6 lines changed

docs/src/training/optimisers.md

Lines changed: 9 additions & 6 deletions
@@ -145,21 +145,24 @@ First, we import ParameterSchedulers.jl and initalize a cosine annealing scheduler
 ```julia
 using ParameterSchedulers

-schedule = Cos(λ0 = 1e-4, λ1 = 1e-2, period = 10)
+schedule = ScheduleIterator(Cos(λ0 = 1e-4, λ1 = 1e-2, period = 10))
 opt = Momentum()
 ```

-Next, you can use your schedule directly in a `for`-loop like any iterator:
+Next, you can use your schedule directly in a `for`-loop:
 ```julia
-for (eta, epoch) in zip(schedule, 1:100)
-  opt.eta = eta
+for epoch in 1:100
+  opt.eta = next!(schedule)
   # your training code here
 end
 ```

-Alternatively, use `ScheduledOptim` from ParameterSchedulers.jl to wrap the optimiser and schedule into a single object that behaves like any Flux optimiser.
+`schedule` can also be indexed (e.g. `schedule[100]`) or iterated like any iterator in Julia:
 ```julia
-@epochs 100 Flux.train!(loss, ps, data, ScheduledOptim(schedule, opt))
+for (eta, epoch) in zip(schedule, 1:100)
+  opt.eta = eta
+  # your training code here
+end
 ```

 ParameterSchedulers.jl allows for many more scheduling policies including arbitrary functions, looping any function with a given period, or sequences of many schedules. See the ParameterSchedulers.jl documentation for more info.
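
For the "sequences of many schedules" policy mentioned in that closing paragraph, a minimal sketch might look like the following. It reuses the `ScheduleIterator`/`next!` pattern from the diff above, while the `Sequence` and `Exp` constructors (and their `schedules`, `step_sizes`, `λ`, and `γ` keywords) are assumptions based on the ParameterSchedulers.jl documentation, so check the package docs for the exact signatures:

```julia
# A sketch only: `Sequence`/`Exp` signatures are assumptions from the
# ParameterSchedulers.jl docs; `ScheduleIterator`/`next!` follow the diff above.
using Flux
using ParameterSchedulers

schedule = ScheduleIterator(
    Sequence(schedules = [Exp(λ = 1e-2, γ = 0.9),                  # exponential decay from 1e-2
                          Cos(λ0 = 1e-4, λ1 = 1e-2, period = 10)], # cosine annealing between 1e-4 and 1e-2
             step_sizes = [10, 90]))                                # 10 epochs of the first, 90 of the second
opt = Momentum()

for epoch in 1:100
    opt.eta = next!(schedule)  # advance the composite schedule once per epoch
    # your training code here
end
```

Under these assumptions, the first 10 epochs follow the exponential decay and the remaining 90 follow the cosine annealing schedule.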
