Commit b422463

Fix checkpointing docs: Loading a comma-separated string from toml file is off (#1281)

As title. From the test: https://github.com/pytorch/torchtitan/blob/main/tests/unit_tests/test_job_config.py#L123

- Command line supports a comma-separated string.
- TOML file only supports TOML native syntax for `list[str]`: `exclude_from_loading = ["optimizer", "lr_scheduler", "dataloader"]`
1 parent 315191e commit b422463

File tree

1 file changed: +3 -2 lines changed


docs/checkpoint.md

Lines changed: 3 additions & 2 deletions
````diff
@@ -64,12 +64,13 @@ python -m torch.distributed.checkpoint.format_utils dcp_to_torch torchtitan/outp
 
 7. EXCLUDING SPECIFIC KEYS FROM CHECKPOINT LOADING
 In some cases, you may want to partially load from a previous-trained checkpoint and modify certain settings, such as the number of GPUs or the current step. To achieve this, you can use the `exclude_from_loading` parameter to specify which keys should be excluded from loading.
-This parameter takes a comma-separated list of keys that should be excluded from loading.
+This parameter takes a list of string that should be excluded from loading.
 
 ```
 [checkpoint]
 enable_checkpoint = true
-exclude_from_loading = "data_loader,lr_scheduler"
+exclude_from_loading = ["data_loader", "lr_scheduler"]
 ```
+When used in command line, the parameter should be a comma-separated list of strings. For example: `--checkpoint.exclude_from_loading data_loader,lr_scheduler`.
 
 
 That's it. You have now successfully converted a sharded torchtitan checkpoint for use in torchtune.
````
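The distinction the commit documents (comma-separated string on the command line, native `list[str]` in TOML) can be sketched with a small normalization helper. Note this is a hypothetical illustration, not torchtitan's actual config parser; `parse_exclude_from_loading` is an assumed name.

```python
# Hypothetical sketch: normalize the two accepted forms of
# exclude_from_loading into one list of checkpoint keys.
def parse_exclude_from_loading(value):
    """Return a list of keys to exclude from checkpoint loading.

    Accepts either a native list (as read from TOML syntax like
    exclude_from_loading = ["data_loader", "lr_scheduler"]) or a
    comma-separated string (as passed on the command line via
    --checkpoint.exclude_from_loading data_loader,lr_scheduler).
    """
    if isinstance(value, list):
        # TOML native syntax already yields a list[str]; pass through.
        return value
    # CLI form: split on commas and drop surrounding whitespace.
    return [key.strip() for key in value.split(",") if key.strip()]

# CLI form and TOML form normalize to the same result:
print(parse_exclude_from_loading("data_loader,lr_scheduler"))
print(parse_exclude_from_loading(["data_loader", "lr_scheduler"]))
```

Both calls yield `["data_loader", "lr_scheduler"]`, which is why the docs now show the list syntax for TOML and reserve the comma-separated form for the command line.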

0 commit comments