
Commit 0dacc7b

Use @autodocs and fix references

1 parent c90695d · commit 0dacc7b

28 files changed (+55, -75 lines)

docs/make.jl

Lines changed: 1 addition & 1 deletion
@@ -11,7 +11,7 @@ makedocs(modules = [Metalhead, Artifacts, LazyArtifacts, Images, OneHotArrays, D
             ],
             "Developer guide" => "contributing.md",
             "API reference" => [
-                "api/models.md",
+                "api/reference.md",
             ],
         ],
         format = Documenter.HTML(

docs/src/api/models.md

Lines changed: 0 additions & 30 deletions
This file was deleted.

docs/src/api/reference.md

Lines changed: 14 additions & 0 deletions
@@ -0,0 +1,14 @@
+# API Reference
+
+The API reference of `Metalhead.jl`.
+
+**Note**:
+
+```@autodocs
+Modules = [Metalhead]
+```
+
+```@docs
+Metalhead.squeeze_excite
+Metalhead.LayerScale
+```
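For context, Documenter's `@autodocs` block splices in the docstring of every binding in the listed modules, while `@docs` pulls in specific bindings by name; once a docstring appears somewhere in the manual, other pages can link to it with `@ref`, which is what the reference fixes below rely on. A minimal sketch of that mechanism, using a hypothetical `Demo` module rather than anything from Metalhead:

```julia
# Hypothetical illustration: a module whose docstring would be collected by
# an `@autodocs` block with `Modules = [Demo]`, after which markdown such as
# [`Demo.widget`](@ref) resolves to this docstring from any docs page.
module Demo

"""
    widget(n)

Return a vector containing `n` copies of the symbol `:widget`.
"""
widget(n::Integer) = fill(:widget, n)

end # module
```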

docs/src/contributing.md

Lines changed: 1 addition & 1 deletion
@@ -16,7 +16,7 @@ To add a new model architecture to Metalhead.jl, you can [open a PR](https://git
 
 - reuse layers from Flux as much as possible (e.g. use `Parallel` before defining a `Bottleneck` struct)
 - adhere as closely as possible to a reference such as a published paper (i.e. the structure of your model should follow intuitively from the paper)
-- use generic functional builders (e.g. [`resnet`](#) is the core function that builds "ResNet-like" models)
+- use generic functional builders (e.g. [`Metalhead.resnet`](@ref) is the core function that builds "ResNet-like" models)
 - use multiple dispatch to add convenience constructors that wrap your functional builder
 
 When in doubt, just open a PR! We are more than happy to help review your code to help it align with the rest of the library. After adding a model, you might consider adding some pre-trained weights (see below).
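The builder-plus-constructor guideline in this file can be illustrated with a hedged sketch; `mymodel`, `MyModel`, and `MYMODEL_CONFIGS` are invented names for illustration and are not part of Metalhead's API:

```julia
using Flux

# Generic functional builder: turns a configuration into a Chain of layers.
function mymodel(depths::AbstractVector{<:Integer}; inchannels::Integer = 3,
                 nclasses::Integer = 1000)
    layers = [Conv((3, 3), (i == 1 ? inchannels : 16) => 16, relu; pad = 1)
              for i in 1:sum(depths)]
    return Chain(Chain(layers...), GlobalMeanPool(), Flux.flatten,
                 Dense(16 => nclasses))
end

# Model wrapper struct, mirroring the `layers::Any` style used in Metalhead.
struct MyModel
    layers::Any
end
Flux.@functor MyModel
(m::MyModel)(x) = m.layers(x)

# Convenience constructor wrapping the functional builder via dispatch.
const MYMODEL_CONFIGS = Dict(:small => [2, 2], :large => [3, 4, 6])
MyModel(config::Symbol; kwargs...) = MyModel(mymodel(MYMODEL_CONFIGS[config]; kwargs...))
```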

docs/src/tutorials/quickstart.md

Lines changed: 1 addition & 1 deletion
@@ -4,7 +4,7 @@
 using Flux, Metalhead
 ```
 
-Using a model from Metalhead is as simple as selecting a model from the table of [available models](#). For example, below we use the pre-trained ResNet-18 model.
+Using a model from Metalhead is as simple as selecting a model from the table of [available models](@ref API-Reference). For example, below we use the pre-trained ResNet-18 model.
 ```julia
 using Flux, Metalhead
 
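A hedged expansion of the quickstart snippet above: it assumes the `ResNet(18; pretrain = true)` constructor documented by Metalhead and feeds the model a random image-shaped array instead of a real photo.

```julia
using Flux, Metalhead

# Load the pre-trained ResNet-18 (downloads ImageNet weights on first use).
model = ResNet(18; pretrain = true)

# A dummy 224×224 RGB "image" in WHCN layout, batch size 1.
x = rand(Float32, 224, 224, 3, 1)

# Forward pass: 1000 ImageNet logits, turned into probabilities with softmax.
probs = softmax(model(x))
```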

src/convnets/alexnet.jl

Lines changed: 1 addition & 1 deletion
@@ -44,7 +44,7 @@ Create a `AlexNet`.
 
 `AlexNet` does not currently support pretrained weights.
 
-See also [`alexnet`](#).
+See also [`alexnet`](@ref).
 """
 struct AlexNet
     layers::Any

src/convnets/convnext.jl

Lines changed: 3 additions & 3 deletions
@@ -8,7 +8,7 @@ Creates a single block of ConvNeXt.
 
 - `planes`: number of input channels.
 - `drop_path_rate`: Stochastic depth rate.
-- `layerscale_init`: Initial value for [`LayerScale`](#)
+- `layerscale_init`: Initial value for [`Metalhead.LayerScale`](@ref)
 """
 function convnextblock(planes::Integer, drop_path_rate = 0.0, layerscale_init = 1.0f-6)
     layers = SkipConnection(Chain(DepthwiseConv((7, 7), planes => planes; pad = 3),
@@ -34,7 +34,7 @@ Creates the layers for a ConvNeXt model.
 - `depths`: list with configuration for depth of each block
 - `planes`: list with configuration for number of output channels in each block
 - `drop_path_rate`: Stochastic depth rate.
-- `layerscale_init`: Initial value for [`LayerScale`](#)
+- `layerscale_init`: Initial value for [`Metalhead.LayerScale`](@ref)
   ([reference](https://arxiv.org/abs/2103.17239))
 - `inchannels`: number of input channels.
 - `nclasses`: number of output classes
@@ -87,7 +87,7 @@ Creates a ConvNeXt model.
 - `inchannels`: The number of channels in the input.
 - `nclasses`: number of output classes
 
-See also [`Metalhead.convnext`](#).
+See also [`Metalhead.convnext`](@ref).
 """
 struct ConvNeXt
     layers::Any
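Since these docstrings now point at `Metalhead.LayerScale`, a brief hedged sketch of the underlying idea from the linked paper (a learnable per-channel scale on the residual branch, initialised to a small value such as `layerscale_init`); this illustrates the technique, not Metalhead's exact implementation:

```julia
using Flux

# LayerScale idea: a learnable per-channel scale, initialised to a small λ,
# applied to the output of a residual branch. The channel dimension is
# assumed to be the first dimension of `x`.
layerscale(planes::Integer, λ = 1.0f-6) = Flux.Scale(fill(Float32(λ), planes), false)

planes = 64
block = SkipConnection(Chain(Dense(planes => planes, gelu),
                             layerscale(planes)), +)

x = randn(Float32, planes, 8)   # features × batch
y = block(x)                    # residual output, initially ≈ x
```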

src/convnets/densenet.jl

Lines changed: 2 additions & 2 deletions
@@ -64,7 +64,7 @@ Create a DenseNet model
 
 - `inplanes`: the number of input feature maps to the first dense block
 - `growth_rates`: the growth rates of output feature maps within each
-  [`dense_block`](#) (a vector of vectors)
+  [`dense_block`](@ref) (a vector of vectors)
 - `reduction`: the factor by which the number of feature maps is scaled across each transition
 - `nclasses`: the number of output classes
 """
@@ -122,7 +122,7 @@ Set `pretrain = true` to load the model with pre-trained weights for ImageNet.
 
 `DenseNet` does not currently support pretrained weights.
 
-See also [`Metalhead.densenet`](#).
+See also [`Metalhead.densenet`](@ref).
 """
 struct DenseNet
     layers::Any

src/convnets/efficientnet.jl

Lines changed: 1 addition & 1 deletion
@@ -88,7 +88,7 @@ const EFFICIENTNET_GLOBAL_CONFIGS = Dict(:b0 => (224, (1.0, 1.0)),
     EfficientNet(config::Symbol; pretrain::Bool = false)
 
 Create an EfficientNet model ([reference](https://arxiv.org/abs/1905.11946v5)).
-See also [`efficientnet`](#).
+See also [`efficientnet`](@ref).
 
 # Arguments
 

src/convnets/inception/googlenet.jl

Lines changed: 1 addition & 1 deletion
@@ -71,7 +71,7 @@ Create an Inception-v1 model (commonly referred to as `GoogLeNet`)
 
 `GoogLeNet` does not currently support pretrained weights.
 
-See also [`googlenet`](#).
+See also [`googlenet`](@ref).
 """
 struct GoogLeNet
     layers::Any
