
Commit 4132da4

fix broken doc links (#170)
* fix broken doc links
* typo
* docstrings
1 parent 78b2141 · commit 4132da4

3 files changed: +16 -16 lines

docs/src/index.md

Lines changed: 7 additions & 7 deletions
@@ -2,7 +2,7 @@
 
 ## An optimisation rule
 
-A new optimiser must overload two functions, [`apply!`](@ref) and [`init`](@ref).
+A new optimiser must overload two functions, [`apply!`](@ref Optimisers.apply!) and [`init`](@ref Optimisers.init).
 These act on one array of parameters:
 
 ```julia
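(The docs' own example is truncated by the hunk boundary above. As a rough sketch of the interface being linked here: a rule subtypes `Optimisers.AbstractRule` and defines `init` and `apply!`. The `MyDescent` name is invented for illustration, not part of the package.)

```julia
using Optimisers

# Hypothetical rule for illustration: plain gradient descent with a
# fixed learning rate (not the package's own Descent definition).
struct MyDescent <: Optimisers.AbstractRule
  eta::Float64
end

# `init` returns whatever per-array state the rule needs; none here.
Optimisers.init(o::MyDescent, x::AbstractArray) = nothing

# `apply!` returns the new state plus the change to make; Optimisers.jl
# then subtracts that change from the parameter array.
Optimisers.apply!(o::MyDescent, state, x, dx) = state, o.eta .* dx
```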
@@ -33,8 +33,8 @@ It of course also makes it easier to store the state.
 
 ## Usage with [Flux.jl](https://github.com/FluxML/Flux.jl)
 
-To apply such an optimiser to a whole model, [`setup`](@ref) builds a tree containing any initial
-state for every trainable array. Then at each step, [`update`](@ref) uses this and the gradient
+To apply such an optimiser to a whole model, [`setup`](@ref Optimisers.setup) builds a tree containing any initial
+state for every trainable array. Then at each step, [`update`](@ref Optimisers.update) uses this and the gradient
 to adjust the model:
 
 ```julia
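(A minimal sketch of the `setup`/`update` pattern this hunk describes, using a plain `NamedTuple` as a stand-in for a Flux model; the field names and hyperparameters are invented for the example.)

```julia
using Optimisers

# Stand-in for a Flux model: any nested structure of arrays works.
model = (weight = rand(3, 3), bias = zeros(3))
grads = (weight = ones(3, 3), bias = fill(0.1, 3))   # pretend gradient

state = Optimisers.setup(Optimisers.Adam(0.01), model)  # tree of states
state, model = Optimisers.update(state, model, grads)   # one step
```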
@@ -142,10 +142,10 @@ end;
 
 Optimisers.jl uses [Functors.jl](https://fluxml.ai/Functors.jl) to walk the `struct`s
 making up the model, for which they must be annotated `@functor Type`.
-By default optimisation will alter all [`isnumeric`](@ref) arrays.
+By default optimisation will alter all [`isnumeric`](@ref Optimisers.isnumeric) arrays.
 
 If some arrays of a particular layer should not be treated this way,
-you can define a method for [`trainable`](@ref)
+you can define a method for [`trainable`](@ref Optimisers.trainable)
 
 ```julia
 struct Layer{T}
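(The docs' `Layer` example is truncated above. A sketch of such a `trainable` overload, with an invented `Scale` layer standing in:)

```julia
using Optimisers, Functors

# Hypothetical layer whose `beta` field should be frozen during training.
struct Scale
  alpha::Vector{Float64}
  beta::Vector{Float64}
end
@functor Scale

# Report only `alpha` as trainable; `setup`/`update` will then skip `beta`,
# although Functors.jl still sees it when walking the struct.
Optimisers.trainable(s::Scale) = (; alpha = s.alpha)
```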
@@ -239,7 +239,7 @@ from StaticArrays.jl.
 ## Obtaining a flat parameter vector
 
 Instead of a nested tree-like structure, sometimes it is convenient to have all the
-parameters as one simple vector. Optimisers.jl contains a function [`destructure`](@ref)
+parameters as one simple vector. Optimisers.jl contains a function [`destructure`](@ref Optimisers.destructure)
 which creates this vector, and also creates a way to re-build the original structure
 with new parameters. Both flattening and re-building may be used within `gradient` calls.
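(A minimal sketch of that flatten/rebuild round trip, with an invented toy model:)

```julia
using Optimisers

# Toy nested model; anything Functors.jl can walk works the same way.
model = (layer = (w = [1.0 2.0; 3.0 4.0], b = [0.0, 0.0]),)

flat, re = Optimisers.destructure(model)   # Vector{Float64} of length 6
model2 = re(flat .* 10)                    # same structure, new values
```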

@@ -270,7 +270,7 @@ st, flat = Optimisers.update(st, flat, ∇flat)
 
 Here `flat` contains only the 283 trainable parameters, while the non-trainable
 ones are preserved inside `re`, an object of type `Restructure`.
-When defining new layers, these can be specified if necessary by overloading [`trainable`](@ref).
+When defining new layers, these can be specified if necessary by overloading [`trainable`](@ref Optimisers.trainable).
 By default, all numeric arrays visible to [Functors.jl](https://github.com/FluxML/Functors.jl)
 are assumed to contain trainable parameters.
 Tied parameters (arrays appearing in different layers) are included only once in `flat`.
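(The tied-parameter behaviour in the last context line can be sketched as below; the `enc`/`dec` names are invented for the example.)

```julia
using Optimisers

shared = [1.0, 2.0]
model = (enc = shared, dec = shared)   # the same array appears twice

flat, re = Optimisers.destructure(model)
length(flat)                # 2, not 4: the tied array is stored once
m2 = re([10.0, 20.0])
m2.enc === m2.dec           # ties are preserved on rebuild
```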

src/Optimisers.jl

Lines changed: 7 additions & 7 deletions
@@ -48,7 +48,7 @@ apply!
     Optimisers.init(rule::RuleType, parameters) -> state
 
 Sets up the initial state for a given optimisation rule, and an array of parameters.
-This and [`apply!`](@ref) are the two functions which any new optimisation rule must define.
+This and [`apply!`](@ref Optimisers.apply!) are the two functions which any new optimisation rule must define.
 
 # Examples
 ```jldoctest
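(The jldoctest itself is truncated by the hunk. As a rough sketch of the `init` half of that contract, assuming Momentum keeps one velocity array per parameter array:)

```julia
using Optimisers

x = rand(3)
st = Optimisers.init(Optimisers.Momentum(), x)  # allocate velocity state
st == zero(x)                                   # true: starts at zero
```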
@@ -70,8 +70,8 @@ init
     Optimisers.setup(rule, model) -> state_tree
 
 Initialises the given optimiser for every trainable parameter within the model.
-Returns a tree of the relevant states, which must be passed to [`update`](@ref)
-or [`update!`](@ref).
+Returns a tree of the relevant states, which must be passed to [`update`](@ref Optimisers.update)
+or [`update!`](@ref Optimisers.update!).
 
 # Example
 ```jldoctest
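(A small sketch of the state tree `setup` returns; the `vec`/`fun` names are invented for illustration.)

```julia
using Optimisers

model = (vec = [1.0, 2.0], fun = sin)   # only the array is trainable
st = Optimisers.setup(Optimisers.Momentum(), model)
# `st` mirrors `model`: a `Leaf` holding Momentum's velocity for `vec`,
# and an empty placeholder for the non-trainable `fun`.
```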
@@ -112,9 +112,9 @@ setup
 
 Uses the optimiser and the gradient to change the trainable parameters in the model.
 Returns the improved model, and the optimiser states needed for the next update.
-The initial tree of states comes from [`setup`](@ref).
+The initial tree of states comes from [`setup`](@ref Optimisers.setup).
 
-See also [`update!`](@ref), which will be faster for models of ordinary `Array`s or `CuArray`s.
+See also [`update!`](@ref Optimisers.update!), which will be faster for models of ordinary `Array`s or `CuArray`s.
 
 # Example
 ```jldoctest
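(Sketching the out-of-place behaviour this docstring contrasts with `update!`, using toy numbers and Descent with η = 0.5:)

```julia
using Optimisers

m = (x = [1.0, 2.0],)
st = Optimisers.setup(Optimisers.Descent(0.5), m)
st2, m2 = Optimisers.update(st, m, (x = [1.0, 1.0],))
m2.x   # [0.5, 1.5]: η .* gradient subtracted; `m` itself is untouched
```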
@@ -136,9 +136,9 @@ update
 
 Uses the optimiser and the gradient to change the trainable parameters in the model.
 Returns the improved model, and the optimiser states needed for the next update.
-The initial tree of states comes from [`setup`](@ref).
+The initial tree of states comes from [`setup`](@ref Optimisers.setup).
 
-This is used in exactly the same manner as [`update`](@ref), but because it may mutate
+This is used in exactly the same manner as [`update`](@ref Optimisers.update), but because it may mutate
 arrays within the old model (and the old state), it will be faster for models of ordinary
 `Array`s or `CuArray`s. However, you should not rely on the old model being fully updated
 but rather use the returned model.
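(A sketch of the usage pattern that caveat implies: rebind to the returned values rather than trusting the old bindings. The toy model is invented.)

```julia
using Optimisers

m = (w = [1.0, 2.0, 3.0],)
st = Optimisers.setup(Optimisers.Descent(0.1), m)

# `update!` may overwrite `m.w` and `st` in place, so always keep
# the returned state and model.
st, m = Optimisers.update!(st, m, (w = ones(3),))
```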

src/destructure.jl

Lines changed: 2 additions & 2 deletions
@@ -5,7 +5,7 @@ const NoT = NoTangent()
 """
     destructure(model) -> vector, reconstructor
 
-Copies all [`trainable`](@ref), [`isnumeric`](@ref) parameters in the model
+Copies all [`trainable`](@ref Optimisers.trainable), [`isnumeric`](@ref Optimisers.isnumeric) parameters in the model
 to a vector, and returns also a function which reverses this transformation.
 Differentiable.
 
@@ -34,7 +34,7 @@ end
 """
     Restructure(Model, ..., length)
 
-This is what [`destructure`](@ref) returns, and `re(p)` will re-build the model with
+This is what [`destructure`](@ref Optimisers.destructure) returns, and `re(p)` will re-build the model with
 new parameters from vector `p`. If the model is callable, then `re(x, p) == re(p)(x)`.
 
 # Example
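(The `re(x, p) == re(p)(x)` identity in this docstring can be sketched with an invented callable layer; `Affine` is not part of the package.)

```julia
using Optimisers, Functors

struct Affine            # hypothetical callable layer
  w::Matrix{Float64}
  b::Vector{Float64}
end
@functor Affine
(a::Affine)(x) = a.w * x .+ a.b

flat, re = Optimisers.destructure(Affine(rand(2, 2), zeros(2)))
x = rand(2)
re(x, flat) ≈ re(flat)(x)   # true: rebuild-and-call in one step
```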
