
[tests] unbloat tests/lora/utils.py #11845


Status: Open. Wants to merge 19 commits into main.
Conversation

@sayakpaul sayakpaul commented Jul 1, 2025

What does this PR do?

We take the following approach:

  • Use parameterized to combine similarly flavored tests.
  • Modularize repeated blocks of code into helper functions and reuse them wherever possible.
  • Remove redundant tests.
  • Make peft>=0.15.0 a hard requirement, which lets us remove the @require_peft_version_greater decorator.
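The PR uses the parameterized package for the first point. As a stdlib-only sketch of the same idea (the scale values and helper function below are hypothetical stand-ins, not diffusers API), unittest.subTest collapses several near-identical test methods into one:

```python
import unittest

# Hypothetical flavors that would otherwise each get their own test method.
SCALES = [0.0, 0.5, 1.0]

def apply_lora_scale(base, scale):
    # Toy stand-in for a pipeline call; not a diffusers function.
    return base * scale

class LoraScaleTests(unittest.TestCase):
    def test_lora_scales(self):
        # One test body covers all flavors instead of three copy-pasted tests;
        # subTest reports each failing flavor separately.
        for scale in SCALES:
            with self.subTest(scale=scale):
                out = apply_lora_scale(2.0, scale)
                self.assertEqual(out, 2.0 * scale)
```

The parameterized.expand decorator achieves the same effect but generates one named test method per case, which plays better with test selection by name.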

In a follow-up PR, I will attempt to improve the tests in the LoRA test suite that take the most time.

@@ -103,34 +103,6 @@ def get_dummy_inputs(self, with_generator=True):

return noise, input_ids, pipeline_inputs

@unittest.skip("Not supported in AuraFlow.")

@sayakpaul sayakpaul Jul 1, 2025


These are skipped appropriately from the parent method. I think it's okay in this case, because it eases things a bit.

@sayakpaul sayakpaul changed the title [wip][tests] unbloat tests/lora/utils.py [tests] unbloat tests/lora/utils.py Jul 3, 2025
@sayakpaul sayakpaul requested a review from BenjaminBossan July 3, 2025 09:32
@sayakpaul sayakpaul marked this pull request as ready for review July 3, 2025 09:32
@BenjaminBossan

I haven't checked the PR yet, but I was wondering: When I do bigger refactors of tests, I always check the line coverage before and after the refactor and ensure that they're the same (in PEFT we use pytest-cov). IMO, having some kind of validation that the refactor maintains the same coverage (or increases it) is quite important, as it's hard to notice when suddenly, some lines of code are not tested anymore.

@sayakpaul

Indeed, it's important. Do you have any more guidelines for me to do that?

@BenjaminBossan

> Do you have any more guidelines for me to do that?

Install pytest-cov if you haven't already, then add --cov=src/diffusers --cov-report=term-missing to your pytest invocation. Do this once on main and once on this branch. You get a report that looks something like this:

Name                                                  Stmts   Miss  Cover   Missing
-----------------------------------------------------------------------------------
src/peft/__init__.py                                     10      0   100%
src/peft/auto.py                                         71     34    52%   61, 82-142
src/peft/config.py                                      132     29    78%   39-43, 67, 89, 144, 148-172, 199-204, 225-228, 240, 242, 261-268, 344
src/peft/helpers.py                                      72     72     0%   15-251
src/peft/import_utils.py                                 94     56    40%   25, 30-35, 40-46, 54-73, 81, 87-98, 108, 128-147, 156-167, 172
src/peft/mapping.py                                      21     10    52%   25-26, 44, 65-78
src/peft/mapping_func.py                                 36     11    69%   73, 79, 85-89, 96, 103, 110, 123-125
[...]
src/peft/utils/peft_types.py                             58      6    90%   137, 140, 143, 156, 159, 163
-----------------------------------------------------------------------------------
TOTAL                                                 16679  12939    22%

It can be a bit difficult to parse what has changed, but basically you want the Missing column after the refactor to be either 1) the same as before or 2) a subset of it (i.e. strictly better coverage). You could write a small tool to compare before and after (I know I did this once but can't find the snippet anymore) or just ask an LLM; they are quite good at finding the differences.
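A minimal sketch of such a comparison tool, assuming the standard term-missing report format shown above (the parsing regex and helper names are my own, not from an existing tool):

```python
import re

def parse_missing(report_text):
    """Parse a pytest-cov term-missing report into {file: set of missing line numbers}."""
    missing = {}
    for line in report_text.splitlines():
        # Match rows like: src/peft/auto.py  71  34  52%  61, 82-142
        m = re.match(r"(\S+\.py)\s+\d+\s+\d+\s+\d+%\s*(.*)", line)
        if not m:
            continue
        path, ranges = m.groups()
        lines = set()
        for part in filter(None, (p.strip() for p in ranges.split(","))):
            if "-" in part:
                lo, hi = map(int, part.split("-"))
                lines.update(range(lo, hi + 1))
            else:
                lines.add(int(part))
        missing[path] = lines
    return missing

def coverage_regressions(before, after):
    """Lines missing after the refactor that were covered before."""
    regressions = {}
    for path, after_missing in after.items():
        newly_missing = after_missing - before.get(path, set())
        if newly_missing:
            regressions[path] = sorted(newly_missing)
    return regressions
```

Feed it the two saved reports and an empty regressions dict means the refactor did not drop any previously covered lines.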
