Add check for second value in sum: Logsumexp #90

Closed · wants to merge 11 commits into from
Changes from 6 commits
20 changes: 13 additions & 7 deletions tests/fixtures/misc/checker/logsumexp.py
@@ -3,12 +3,18 @@
b = torch.randn(5)

# logsumexp
y = torch.log(torch.sum(torch.exp(x), 1, keepdim=True))
y = torch.log(torch.sum(torch.exp(2.5 + x), 1))
y = torch.log(torch.sum(torch.exp(x), 1, keepdim=True))  # all sum arguments present, with keepdim=True
y = torch.log(torch.sum(torch.exp(2.5 + x), 1))  # addition inside the exp call
y = torch.log(torch.sum(torch.exp(x), dim=1, keepdim=True))  # all sum arguments passed as keywords
y = torch.log(torch.sum(torch.exp(x), dim=1))  # default value of keepdim is False

# not logsumexp
y = torch.log(torch.sum(torch.exp(x), 1, keepdim=True) + 2.5)
y = torch.log(torch.sum(torch.exp(x) + 2.5, 1))
y = torch.log(2 + x)
y = torch.sum(torch.log(torch.exp(x)), 1)
y = torch.exp(torch.sum(torch.log(x), 1, keepdim=True))
y = torch.log(torch.sum(torch.exp(x), 1, keepdim=True) + 2.5)  # can't have an addition inside the log call
y = torch.log(torch.sum(torch.exp(x) + 2.5, 1))  # can't have an addition inside the sum call; sum expects the bare exp(x) tensor
y = torch.log(2 + x)  # missing sum and exp
y = torch.sum(torch.log(torch.exp(x)), 1)  # log and sum are in the wrong order
y = torch.exp(torch.sum(torch.log(x), 1, keepdim=True))  # order of log, sum and exp is reversed
y = torch.log(torch.sum(torch.exp(2.5)))  # not flagged: sum has no dim argument and exp gets a scalar, not a tensor
y = torch.log(torch.sum(torch.exp(x)), dim=1)  # dim is not part of the sum call
y = torch.log(torch.sum(torch.exp(x)), dim=None)  # dim is not part of the sum call and is None
y = torch.log(torch.sum(torch.exp(x), keepdim=True, dim=None))  # not flagged: dim is None
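For context on the TOR108 pattern these fixtures exercise, here is a minimal, hedged sketch (not part of the PR) of why the fused op is preferred: torch.logsumexp matches the log/sum/exp chain on ordinary inputs but stays finite where the naive chain overflows. The 2-D shape of x is an assumption for illustration.

import torch

x = torch.randn(5, 5)  # assumed 2-D input, analogous to the fixture's x

# The naive chain and the fused op agree on well-behaved inputs.
naive = torch.log(torch.sum(torch.exp(x), dim=1, keepdim=True))
stable = torch.logsumexp(x, dim=1, keepdim=True)
assert torch.allclose(naive, stable)

# For large magnitudes exp() overflows to inf, so the naive chain returns inf,
# while logsumexp subtracts the max internally and stays finite.
big = x + 1000.0
print(torch.log(torch.sum(torch.exp(big), dim=1)))  # inf everywhere
print(torch.logsumexp(big, dim=1))                  # finite, roughly 1000 + logsumexp(x)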
2 changes: 2 additions & 0 deletions tests/fixtures/misc/checker/logsumexp.txt
@@ -1,2 +1,4 @@
6:5 TOR108 Use numerically stabilized `torch.logsumexp`.
7:5 TOR108 Use numerically stabilized `torch.logsumexp`.
8:5 TOR108 Use numerically stabilized `torch.logsumexp`.
9:5 TOR108 Use numerically stabilized `torch.logsumexp`.
21 changes: 15 additions & 6 deletions torchfix/visitors/misc/__init__.py
@@ -184,9 +184,18 @@ def visit_Call(self, node):
            )
            == "torch.exp"
        ):
            self.add_violation(
                node,
                error_code=self.ERRORS[0].error_code,
                message=self.ERRORS[0].message(),
                replacement=None,
            )
            if self.get_specific_arg(
                node.args[0].value, arg_name="dim", arg_pos=1
            ):
                if (
                    self.get_specific_arg(
                        node.args[0].value, arg_name="dim", arg_pos=1
                    ).value.value
Contributor:
You only check the value of the argument when it's present.
It should be two nested ifs: first check that the argument is present, then check that its value is not None.

Contributor Author:
The first "if" on line 187 checks that "dim" is present; only if that is confirmed does control reach the second "if" on line 190, which checks its value.

Contributor:
I see now.
You should assign the arg to a variable to reduce code duplication, and then add an assert that it's not None; otherwise MyPy complains:

https://github.com/pytorch-labs/torchfix/actions/runs/13081681874/job/36506448560?pr=90

See this for an example: https://github.com/pytorch-labs/torchfix/blob/main/torchfix/visitors/deprecated_symbols/__init__.py#L35
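A rough sketch of the restructuring being suggested, intended to slot into visit_Call in the diff above; get_specific_arg and add_violation come from that diff, and the exact final shape is an assumption, not the committed code.

# Fetch the "dim" argument once and bind it to a name.
dim_arg = self.get_specific_arg(node.args[0].value, arg_name="dim", arg_pos=1)
if dim_arg is not None:
    # The explicit None check (or an assert dim_arg is not None) lets mypy
    # narrow dim_arg from Optional, so the attribute access below type-checks.
    # Accessing .value.value may still need ensure_type; see the next comment.
    if dim_arg.value.value != "None":
        self.add_violation(
            node,
            error_code=self.ERRORS[0].error_code,
            message=self.ERRORS[0].message(),
            replacement=None,
        )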

Contributor:
You may also need to use ensure_type, like here: https://github.com/pytorch-labs/torchfix/blob/main/torchfix/visitors/deprecated_symbols/qr.py#L19

Please run mypy locally and verify it passes.
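For reference, a hedged sketch of how libcst's ensure_type could be applied here; dim_arg is the hypothetical variable from the previous sketch, and whether the final PR does exactly this is not shown.

import libcst as cst

dim_arg = self.get_specific_arg(node.args[0].value, arg_name="dim", arg_pos=1)
if dim_arg is not None and not isinstance(dim_arg.value, cst.Name):
    # ensure_type asserts the runtime type and returns the node typed as
    # cst.Integer, so mypy accepts the .value access; dim=None parses to
    # cst.Name("None") and is screened out above. (As the author notes below,
    # a cst.Tuple dim would need separate handling.)
    dim_value = cst.ensure_type(dim_arg.value, cst.Integer).value  # e.g. "1"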

Contributor Author (@shivam096, Jan 31, 2025):
I ran the code, and the mypy errors are only resolved when I do an isinstance() check.

In doing so I need to check for both Integer and Tuple, since dim can take either type as its argument value, and exclude Name, which is the node type when the value is None.

And since a tuple's contents cannot be retrieved through .value, I need to restructure the code so it handles both integer and tuple values for dim, flagging only those that are not None.
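A possible shape for the check the author describes, assuming libcst node types (dim=1 parses to cst.Integer, dim=(0, 1) to cst.Tuple, dim=None to cst.Name); this is an illustration, not necessarily the committed code.

import libcst as cst

dim_arg = self.get_specific_arg(node.args[0].value, arg_name="dim", arg_pos=1)
if dim_arg is not None and isinstance(dim_arg.value, (cst.Integer, cst.Tuple)):
    # Only a concrete reduction dimension (an integer or a tuple of integers)
    # makes the pattern a real logsumexp; dim=None is a cst.Name node and is
    # rejected by the isinstance check, so it is not flagged.
    self.add_violation(
        node,
        error_code=self.ERRORS[0].error_code,
        message=self.ERRORS[0].message(),
        replacement=None,
    )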

Contributor Author:
I have updated the code and made the necessary changes to handle any future type-based issues.

Updated the test cases as well.

                    != "None"
                ):
                    self.add_violation(
                        node,
                        error_code=self.ERRORS[0].error_code,
                        message=self.ERRORS[0].message(),
                        replacement=None,
                    )