
Commit 108cbc8

Merge #1612
1612: fix AdamW and improve decays docs r=DhairyaLGandhi a=CarloLucibello

There is great disorder under the sky with optimizers. Since in chaining optimizers

```
opt = Optimiser(opt1, opt2)
```

the order generally matters (a lot!), we have to be very careful in documenting how to use decays. In fact, we were giving completely wrong directions for `InvDecay` and `ExpDecay`. The correct ordering for standard use is

```julia
Optimiser(WeightDecay(), ADAM())  # equivalent to L2 regularization
Optimiser(ADAM(), InvDecay())     # learning rate scheduling
Optimiser(ADAM(), ExpDecay())     # learning rate scheduling
```

Different orderings are typically to be considered bugs in user code. This PR fixes the examples and tries to clarify the documentation in this regard.

It also fixes AdamW, which was doing something totally wrong due to the aforementioned confusion (see https://towardsdatascience.com/why-adamw-matters-736223f31b5d for how AdamW works).

Related in model-zoo: FluxML/model-zoo#303 and FluxML/model-zoo#304

Co-authored-by: CarloLucibello <carlo.lucibello@gmail.com>
Co-authored-by: Carlo Lucibello <carlo.lucibello@unibocconi.it>
2 parents 3b7895e + 380ca76 commit 108cbc8
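Why the ordering matters: `Optimiser` applies each wrapped optimizer's gradient transformation in sequence, so the first element sees the raw gradient and the last one produces the update that is finally subtracted from the parameters. Below is a minimal sketch of the two `WeightDecay` orderings discussed above, using Flux's `Optimise.apply!` API; the array shapes and constants are arbitrary illustration, not code from this PR.

```julia
using Flux.Optimise: Optimiser, ADAM, WeightDecay, apply!

W    = randn(Float32, 2, 2)   # a parameter array
grad = randn(Float32, 2, 2)   # its gradient from some loss

# Optimiser applies each wrapped optimizer's transformation in turn,
# so these two chains generally yield different updates.
l2_like   = Optimiser(WeightDecay(1f-4), ADAM())  # decay feeds into ADAM: ≈ L2 regularization
decoupled = Optimiser(ADAM(), WeightDecay(1f-4))  # decay added after ADAM's normalized step

Δ1 = apply!(l2_like, W, copy(grad))
Δ2 = apply!(decoupled, W, copy(grad))
# Flux.update! would then subtract the transformed gradient: W .-= Δ
```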

File tree

1 file changed: +31 -11 lines changed


src/optimise/optimisers.jl

Lines changed: 31 additions & 11 deletions
````diff
@@ -491,7 +491,7 @@ opt = ADAMW(0.001, (0.89, 0.995), 0.1)
 ```
 """
 ADAMW(η = 0.001, β = (0.9, 0.999), decay = 0) =
-  Optimiser(ADAM(η, β), WeightDecay(decay))
+  Optimiser(ADAM(1, β), WeightDecay(decay), Descent(η))
 
 """
     AdaBelief(η = 0.001, β::Tuple = (0.9, 0.999))
````
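With this fix, `ADAMW(η, β, decay)` builds a chain in which ADAM normalizes the gradient with unit step size, the weight-decay term is added afterwards, and a final `Descent(η)` scales the whole update by the learning rate, i.e. the decoupled weight decay described in the linked article. A short usage sketch with the example values from the docstring above (illustrative only):

```julia
using Flux.Optimise: Optimiser, ADAM, WeightDecay, Descent, ADAMW

η, β, decay = 0.001, (0.89, 0.995), 0.1

# What the fixed ADAMW(η, β, decay) expands to:
opt_manual = Optimiser(ADAM(1, β), WeightDecay(decay), Descent(η))

# Same thing through the constructor itself:
opt = ADAMW(η, β, decay)
```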
````diff
@@ -564,9 +564,18 @@ Apply inverse time decay to an optimiser, so that the effective step size at
 iteration `n` is `eta / (1 + γ * n)` where `eta` is the initial step size.
 The wrapped optimiser's step size is not modified.
 
+See also the [Scheduling Optimisers](@ref) section of the docs
+for more general scheduling techniques.
+
 # Examples
+
+`InvDecay` is typically composed with other optimizers
+as the last transformation of the gradient:
+
 ```julia
-Optimiser(InvDecay(..), Opt(..))
+# Inverse decay of the learning rate
+# with starting value 0.001 and decay coefficient 0.01.
+opt = Optimiser(ADAM(1f-3), InvDecay(1f-2))
 ```
 """
 mutable struct InvDecay <: AbstractOptimiser
````
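For intuition about the schedule documented here, the effective step size under `InvDecay(γ)` after `n` iterations is `eta / (1 + γ * n)`. A tiny standalone check of that formula (plain Julia, no Flux needed; the function name is made up for illustration):

```julia
# Effective step size after n iterations, per the docstring formula.
inv_decayed_eta(eta, γ, n) = eta / (1 + γ * n)

inv_decayed_eta(0.001, 0.01, 0)    # 0.001   (initial step size)
inv_decayed_eta(0.001, 0.01, 100)  # 0.0005  (halved after 100 iterations)
```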
````diff
@@ -598,12 +607,16 @@ a minimum of `clip`.
   two decay operations.
 - `clip`: Minimum value of learning rate.
 
+
+See also the [Scheduling Optimisers](@ref) section of the docs
+for more general scheduling techniques.
+
 # Examples
-To apply exponential decay to an optimiser:
-```julia
-Optimiser(ExpDecay(..), Opt(..))
 
-opt = Optimiser(ExpDecay(), ADAM())
+`ExpDecay` is typically composed with other optimizers
+as the last transformation of the gradient:
+```julia
+opt = Optimiser(ADAM(), ExpDecay())
 ```
 """
 mutable struct ExpDecay <: AbstractOptimiser
````
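Likewise, the schedule `ExpDecay` documents multiplies the step size by `decay` every `decay_step` iterations until it reaches `clip`. A rough standalone sketch of the resulting learning-rate curve (plain Julia; this mirrors the docstring's description, not the `apply!` implementation):

```julia
# Approximate effective step size after n iterations with ExpDecay's defaults.
expdecay_eta(n; eta = 0.001, decay = 0.1, decay_step = 1000, clip = 1e-4) =
    max(eta * decay^floor(n / decay_step), clip)

expdecay_eta(0)     # 0.001
expdecay_eta(1000)  # 0.0001 (one decay applied; already at the clip value)
```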
````diff
@@ -614,7 +627,8 @@ mutable struct ExpDecay <: AbstractOptimiser
   current::IdDict
 end
 
-ExpDecay(opt = 0.001, decay = 0.1, decay_step = 1000, clip = 1e-4) = ExpDecay(opt, decay, decay_step, clip, IdDict())
+ExpDecay(opt = 0.001, decay = 0.1, decay_step = 1000, clip = 1e-4) =
+  ExpDecay(opt, decay, decay_step, clip, IdDict())
 
 function apply!(o::ExpDecay, x, Δ)
   η, s, decay = o.eta, o.step, o.decay
````
````diff
@@ -627,12 +641,18 @@ function apply!(o::ExpDecay, x, Δ)
 end
 
 """
-    WeightDecay(wd = 0)
+    WeightDecay(λ = 0)
 
-Decay weights by `wd`.
+Decay weights by ``λ``.
+Typically composed with other optimizers as the first transformation to the gradient,
+making it equivalent to adding ``L_2`` regularization
+with coefficient ``λ`` to the loss.
 
-# Parameters
-- Weight decay (`wd`)
+# Examples
+
+```julia
+opt = Optimiser(WeightDecay(1f-4), ADAM())
+```
 """
 mutable struct WeightDecay <: AbstractOptimiser
   wd::Real
````