Commit 0301d59

📚 Add training from a checkpoint example (#2389)
* Add training from a checkpoint example

  Signed-off-by: Samet Akcay <samet.akcay@intel.com>

* Replace patchcore example with efficient-ad

  Signed-off-by: Samet Akcay <samet.akcay@intel.com>

---------

Signed-off-by: Samet Akcay <samet.akcay@intel.com>
1 parent db4c285 commit 0301d59

File tree

3 files changed: +28 additions, -20 deletions


docs/source/markdown/get_started/anomalib.md

Lines changed: 16 additions & 14 deletions
@@ -17,7 +17,7 @@ The installer can be installed using the following commands:
 :::{tab-item} API
 :sync: label-1

-```{literalinclude} ../../snippets/install/pypi.txt
+```{literalinclude} /snippets/install/pypi.txt
 :language: bash
 ```

@@ -26,7 +26,7 @@ The installer can be installed using the following commands:
 :::{tab-item} Source
 :sync: label-2

-```{literalinclude} ../../snippets/install/source.txt
+```{literalinclude} /snippets/install/source.txt
 :language: bash
 ```

@@ -42,22 +42,22 @@ The next section demonstrates how to install the full package using the CLI inst
 :::::{dropdown} Installing the Full Package
 After installing anomalib, you can install the full package using the following commands:

-```{literalinclude} ../../snippets/install/anomalib_help.txt
+```{literalinclude} /snippets/install/anomalib_help.txt
 :language: bash
 ```

 As can be seen above, the only available sub-command is `install` at the moment.
 The `install` sub-command has options to install either the full package or the
 specific components of the package.

-```{literalinclude} ../../snippets/install/anomalib_install_help.txt
+```{literalinclude} /snippets/install/anomalib_install_help.txt
 :language: bash
 ```

 By default the `install` sub-command installs the full package. If you want to
 install only the specific components of the package, you can use the `--option` flag.

-```{literalinclude} ../../snippets/install/anomalib_install.txt
+```{literalinclude} /snippets/install/anomalib_install.txt
 :language: bash
 ```

@@ -66,21 +66,23 @@ After following these steps, your environment will be ready to use anomalib!

 ## {octicon}`mortar-board` Training

-Anomalib supports both API and CLI-based training. The API is more flexible and allows for more customization, while the CLI training utilizes command line interfaces, and might be easier for those who would like to use anomalib off-the-shelf.
+Anomalib supports both API and CLI-based training. The API is more flexible
+and allows for more customization, while the CLI training utilizes command line
+interfaces, and might be easier for those who would like to use anomalib off-the-shelf.

 ::::{tab-set}

 :::{tab-item} API

-```{literalinclude} ../../snippets/train/api/default.txt
+```{literalinclude} /snippets/train/api/default.txt
 :language: python
 ```

 :::

 :::{tab-item} CLI

-```{literalinclude} ../../snippets/train/cli/default.txt
+```{literalinclude} /snippets/train/cli/default.txt
 :language: bash
 ```

@@ -100,7 +102,7 @@ Anomalib includes multiple inferencing scripts, including Torch, Lightning, Grad
 :::{tab-item} API
 :sync: label-1

-```{literalinclude} ../../snippets/inference/api/lightning.txt
+```{literalinclude} /snippets/inference/api/lightning.txt
 :language: python
 ```

@@ -109,7 +111,7 @@ Anomalib includes multiple inferencing scripts, including Torch, Lightning, Grad
 :::{tab-item} CLI
 :sync: label-2

-```{literalinclude} ../../snippets/inference/cli/lightning.txt
+```{literalinclude} /snippets/inference/cli/lightning.txt
 :language: bash
 ```

@@ -201,15 +203,15 @@ Anomalib supports hyper-parameter optimization using [wandb](https://wandb.ai/)

 :::{tab-item} CLI

-```{literalinclude} ../../snippets/pipelines/hpo/cli.txt
+```{literalinclude} /snippets/pipelines/hpo/cli.txt
 :language: bash
 ```

 :::

 :::{tab-item} API

-```{literalinclude} ../../snippets/pipelines/hpo/api.txt
+```{literalinclude} /snippets/pipelines/hpo/api.txt
 :language: bash
 ```

@@ -233,15 +235,15 @@ To run a training experiment with experiment tracking, you will need the followi

 By using the configuration file above, you can run the experiment with the following command:

-```{literalinclude} ../../snippets/logging/cli.txt
+```{literalinclude} /snippets/logging/cli.txt
 :language: bash
 ```

 :::

 :::{tab-item} API

-```{literalinclude} ../../snippets/logging/api.txt
+```{literalinclude} /snippets/logging/api.txt
 :language: bash
 ```

docs/source/snippets/train/api/default.txt

Lines changed: 7 additions & 4 deletions
@@ -1,12 +1,15 @@
 # Import the required modules
 from anomalib.data import MVTec
-from anomalib.models import Patchcore
 from anomalib.engine import Engine
+from anomalib.models import EfficientAd

 # Initialize the datamodule, model and engine
-datamodule = MVTec()
-model = Patchcore()
-engine = Engine()
+datamodule = MVTec(train_batch_size=1)
+model = EfficientAd()
+engine = Engine(max_epochs=5)

 # Train the model
 engine.fit(datamodule=datamodule, model=model)

+
+# Continue from a checkpoint
+engine.fit(datamodule=datamodule, model=model, ckpt_path="path/to/checkpoint.ckpt")
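The `ckpt_path` argument added above follows the PyTorch Lightning convention of `Trainer.fit(ckpt_path=...)`, which anomalib's `Engine` wraps: the saved epoch counter, model weights, and optimizer state are restored before training continues, rather than starting from scratch. As a rough, self-contained illustration of that resume semantics (this is not anomalib code; the toy `train` helper and its pickle-based checkpoint format are invented for this sketch):

```python
import pickle
import tempfile
from pathlib import Path

def train(total_epochs, ckpt_path=None, ckpt_dir="."):
    """Toy training loop that can resume from a checkpoint file."""
    if ckpt_path is None:
        # Fresh run: start from epoch 0 with untrained "weights"
        state = {"epoch": 0, "weight": 0.0}
    else:
        # Resume: restore the epoch counter and weights from the checkpoint
        state = pickle.loads(Path(ckpt_path).read_bytes())
    while state["epoch"] < total_epochs:
        state["weight"] += 0.1  # stand-in for one epoch of optimization
        state["epoch"] += 1
        # Persist a checkpoint after every epoch, analogous to Lightning's "last.ckpt"
        Path(ckpt_dir, "last.ckpt").write_bytes(pickle.dumps(state))
    return state

with tempfile.TemporaryDirectory() as d:
    # First run: train for 2 epochs, then stop (e.g. an interrupted job)
    train(total_epochs=2, ckpt_dir=d)
    # Second run: resume from the checkpoint and finish epochs 3-5
    resumed = train(total_epochs=5, ckpt_path=str(Path(d, "last.ckpt")), ckpt_dir=d)
    print(resumed["epoch"])  # 5
```

The resumed run performs only the three remaining epochs, which is why the CLI counterpart below passes both the original config and `--ckpt_path`.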

docs/source/snippets/train/cli/default.txt

Lines changed: 5 additions & 2 deletions
@@ -2,10 +2,13 @@
 anomalib train -h

 # Train by using the default values.
-anomalib train --model Patchcore --data anomalib.data.MVTec
+anomalib train --model EfficientAd --data anomalib.data.MVTec --data.train_batch_size 1

 # Train by overriding arguments.
-anomalib train --model Patchcore --data anomalib.data.MVTec --data.category transistor
+anomalib train --model EfficientAd --data anomalib.data.MVTec --data.train_batch_size 1 --data.category transistor

 # Train by using a config file.
 anomalib train --config <path/to/config>
+
+# Continue training from a checkpoint
+anomalib train --config <path/to/config> --ckpt_path <path/to/checkpoint.ckpt>
