Commit 9b716fc

Doc update (saving.md): removed outdated info; Typo fix.
Parent: ea26f45

2 files changed (+2, -8 lines)


CONTRIBUTING.md

Lines changed: 1 addition & 1 deletion
@@ -8,7 +8,7 @@ One of the best ways to contribute is by looking at issues labeled ["help wanted
 
 ## Good First Issues
 
-While there are not many right now, we do have a section for ["good for issues"](https://github.com/FluxML/Flux.jl/labels/good%20first%20issue). As mentioned above, if any of these seem interesting but there is no clear next step in your mind, please feel free to ask for a suggested step. Often times in open source, issues labeled as "good first issue" actually take some back and forth between maintainers and contributors before the issues is ready to be tackled by a new contributor.
+While there are not many right now, we do have a section for ["good first issues"](https://github.com/FluxML/Flux.jl/labels/good%20first%20issue). As mentioned above, if any of these seem interesting but there is no clear next step in your mind, please feel free to ask for a suggested step. Often times in open source, issues labeled as "good first issue" actually take some back and forth between maintainers and contributors before the issues is ready to be tackled by a new contributor.
 
 ## Model Zoo
 
docs/src/saving.md

Lines changed: 1 addition & 7 deletions
@@ -118,10 +118,4 @@ revert to an older copy of the model if it starts to overfit.
 @save "model-$(now()).bson" model loss = testloss()
 ```
 
-You can even store optimiser state alongside the model, to resume training
-exactly where you left off.
-
-```julia
-opt = ADAM()
-@save "model-$(now()).bson" model opt
-```
+Note that to resume a model's training, you might need to restore other stateful parts of your training loop. Possible examples are stateful optimizers (which usually utilize an `IdDict` to store their state, which is not automatically handled by `BSON`), and the randomness used to partition the original data into the training and validation sets.

0 commit comments
