
Conversation

@thomasjpfan (Contributor) commented Dec 31, 2024

Context

What is the purpose of this PR? Is it to

  • add a new feature
  • fix a bug
  • update tests and/or documentation
  • other (Updates config)

Fixes #1993

I ran the following script to find the recipes that support clip_grad_norm and to update their YAML config files.

from torchtune._recipe_registry import get_all_recipes
import torchtune
import os
from pathlib import Path

recipes = get_all_recipes()
recipes_dir = Path(torchtune.__path__[0]).parent / "recipes"
config_dir = recipes_dir / "configs"

# Heuristic: treat any recipe whose source mentions clip_grad_norm as supporting it.
support_clip_grads = [
    recipe
    for recipe in recipes
    if "clip_grad_norm" in (recipes_dir / recipe.file_path).read_text()
]

for recipe in support_clip_grads:
    for config in recipe.configs:
        # dev configs are resolved relative to the recipes directory; all others live under configs/.
        if config.file_path.startswith("dev"):
            yaml_config = recipes_dir / config.file_path
        else:
            yaml_config = config_dir / config.file_path

        assert yaml_config.exists(), yaml_config

        yaml_text = yaml_config.read_text()

        # Only touch training configs (those that define a loss) which do not
        # already expose clip_grad_norm.
        if "loss" not in yaml_text:
            continue

        if "clip_grad_norm" in yaml_text:
            continue

        assert "compile" in yaml_text

        text = yaml_text.split(os.linesep)

        # Find the compile entry; the new field is inserted directly above it.
        compile_idx = None
        for idx, line in enumerate(text):
            if line.startswith("compile"):
                compile_idx = idx
                break
        assert compile_idx is not None

        text.insert(compile_idx, "clip_grad_norm: null")

        result = os.linesep.join(text)
        with yaml_config.open("w") as f:
            f.write(result)

Changelog

What are the changes made in this PR?

  • Adds clip_grad_norm: null to the YAML configs of every recipe that supports gradient clipping (a sketch of how a recipe typically consumes this field follows below).
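
For reference, here is a minimal sketch (not code from this PR) of how a recipe typically consumes the field: "clip_grad_norm: null" in the YAML arrives as None and leaves training unchanged, while a float such as 1.0 clips the global gradient norm before each optimizer step via torch.nn.utils.clip_grad_norm_. The tiny model, optimizer, and cfg dict below are illustrative placeholders, not recipe code.

import torch
from torch import nn

# Placeholder stand-ins for a real recipe's parsed config, model, and optimizer.
cfg = {"clip_grad_norm": 1.0}  # "clip_grad_norm: null" in YAML would show up as None here
model = nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

loss = model(torch.randn(8, 4)).sum()
loss.backward()

clip_grad_norm = cfg.get("clip_grad_norm", None)
if clip_grad_norm is not None:
    # Clip the global gradient norm before stepping the optimizer.
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=float(clip_grad_norm))

optimizer.step()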

Test plan

Please make sure to do each of the following if applicable to your PR. If you're unsure about any one of these just ask and we will happily help. We also have a contributing page for some guidance on contributing.

  • run pre-commit hooks and linters (make sure you've first installed via pre-commit install)
  • add unit tests for any new functionality
  • update docstrings for any new or updated methods or classes
  • run unit tests via pytest tests
  • run recipe tests via pytest tests -m integration_test
  • manually run any new or modified recipes with sufficient proof of correctness
  • include relevant commands and any other artifacts in this summary (pastes of loss curves, eval results, etc.)

UX

If your function changed a public API, please add a dummy example of what the user experience will look like when calling it.
Here is a docstring example and a tutorial example.

  • I did not change any public API
  • I have added an example to docs or docstrings


pytorch-bot (bot) commented Dec 31, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/torchtune/2220

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit c9b9c4e with merge base 5d1866f:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot added the CLA Signed label Dec 31, 2024
@RdoubleA requested a review from felipemello1 January 1, 2025 02:00

@joecummings (Member) left a comment

Awesome! Thanks for doing this :)

@ebsmothers (Contributor) commented:

Maybe I am missing something, but I don't see gradient clipping support in the DPO, PPO, or distributed KD recipes. Can we remove the clip_grad_norm fields from those configs? (We should probably enable it for KD at the very least, but that can likely be done separately.)

@thomasjpfan (Contributor, Author) commented Jan 10, 2025

@ebsmothers Thanks for catching this! There was a bug in my script for finding the recipes that support gradient clipping. I updated this PR to fix the issue.
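
For illustration only (the actual fix in this PR may differ), one hypothetical way to make such a filter stricter is to require that a recipe actually reads clip_grad_norm from its config rather than merely containing the string somewhere, e.g. with a regex over each recipe's source. The pattern below is an assumption, not the check used in this PR.

import re
from pathlib import Path

import torchtune
from torchtune._recipe_registry import get_all_recipes

recipes_dir = Path(torchtune.__path__[0]).parent / "recipes"

# Hypothetical stricter filter: keep only recipes whose source appears to read the
# field out of the config, e.g. cfg.clip_grad_norm or cfg.get("clip_grad_norm").
reads_field = re.compile(r"""cfg\.(clip_grad_norm|get\(\s*['"]clip_grad_norm['"])""")

support_clip_grads = [
    recipe
    for recipe in get_all_recipes()
    if reads_field.search((recipes_dir / recipe.file_path).read_text())
]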

@ebsmothers (Contributor) left a comment

Thanks @thomasjpfan, this looks good! Once CI is green I'll merge it.

@ebsmothers merged commit b68cddd into meta-pytorch:main on Jan 10, 2025
17 checks passed
Ankur-singh pushed a commit to Ankur-singh/torchtune that referenced this pull request Jan 18, 2025
@RdoubleA mentioned this pull request Jan 21, 2025

Labels

CLA Signed: This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed.


Development

Successfully merging this pull request may close these issues.

Gradient clipping is not visible in our configs/docs

4 participants