
Commit c881068

chore: update submodules (#189)
Co-authored-by: ydcjeff <ydcjeff@users.noreply.github.com>
Parent: b908cda

File tree

3 files changed: +3 −3 lines changed


src/how-to-guides/03-time-profiling.md

Lines changed: 1 addition & 1 deletion
@@ -16,7 +16,7 @@ This example demonstrates how you can get the time breakdown for:
 - Individual epochs during training
 - Total training time
 - Individual [`Events`](https://pytorch-ignite.ai/concepts/02-events-and-handlers#events)
-- All [`Handlers`](https://pytorch-ignite.ai/concepts/02-events-and-handlers#handlers) correspoding to an `Event`
+- All [`Handlers`](https://pytorch-ignite.ai/concepts/02-events-and-handlers#handlers) corresponding to an `Event`
 - Individual `Handlers`
 - Data loading and Data processing.
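For context on the guide being corrected above, Ignite ships time profilers that produce exactly this kind of breakdown. Below is a minimal sketch using `ignite.handlers.BasicTimeProfiler`; the trivial update function and the `dummy_loader` are invented for illustration and are not taken from the guide.

```python
# Sketch only: BasicTimeProfiler comes from ignite.handlers; train_step and
# dummy_loader below are stand-ins invented for this illustration.
import torch
from ignite.engine import Engine
from ignite.handlers import BasicTimeProfiler

def train_step(engine, batch):
    # stand-in for a real forward/backward/optimizer step
    return torch.as_tensor(batch).float().mean().item()

trainer = Engine(train_step)

profiler = BasicTimeProfiler()
profiler.attach(trainer)  # times dataflow, processing and event handlers

dummy_loader = [[1.0, 2.0], [3.0, 4.0]]
trainer.run(dummy_loader, max_epochs=2)

# Per-epoch, total, per-event and dataflow/processing breakdown
profiler.print_results(profiler.get_results())
```

When a per-handler breakdown is needed, `ignite.handlers.HandlersTimeProfiler` can be attached in the same way.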

src/how-to-guides/04-fastai-lr-finder.md

Lines changed: 1 addition & 1 deletion
@@ -72,7 +72,7 @@ optimizer = torch.optim.RMSprop(model.parameters(), lr=1e-06)
 criterion = nn.CrossEntropyLoss()
 ```
 
-We will first train the model with a fixed learning rate (lr) of 1e-06 and inspect our results. Let's save the initial state of the model and the optimizer to restore them later for comparision.
+We will first train the model with a fixed learning rate (lr) of 1e-06 and inspect our results. Let's save the initial state of the model and the optimizer to restore them later for comparison.
 
 
 ```python
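The sentence fixed above refers to snapshotting the model and optimizer before the fixed-lr run so they can be restored before the lr-finder run. A minimal sketch of that step, assuming plain `state_dict` copies and a stand-in `nn.Linear` model rather than the network defined in the guide:

```python
# Sketch of the save/restore-for-comparison step; the Linear model is a
# stand-in, not the network used in the guide.
import copy
import torch
import torch.nn as nn

model = nn.Linear(10, 2)
optimizer = torch.optim.RMSprop(model.parameters(), lr=1e-06)

# Snapshot the initial state so both experiments start from identical
# weights and optimizer statistics.
init_model_state = copy.deepcopy(model.state_dict())
init_opt_state = copy.deepcopy(optimizer.state_dict())

# ... run the fixed-lr (1e-06) training and inspect the results ...

# Restore before the lr-finder run so the comparison is fair.
model.load_state_dict(init_model_state)
optimizer.load_state_dict(init_opt_state)
```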
