add_loss implementation bug? #21289


Closed
franklin5 opened this issue May 16, 2025 · 1 comment
@franklin5
Contributor

I was reading these lines and couldn't believe my eyes:

https://github.com/keras-team/keras/blob/master/keras/src/layers/layer.py#L1205-L1207


```python
def add_loss(self, loss):
    ...
    # Eager only.
    losses = tree.flatten(loss)
    ...
    if backend.in_stateless_scope():
        scope = backend.get_stateless_scope()
        if scope.collect_losses:
            for x in losses:
                scope.add_loss(loss)
                self._loss_ids.add(id(loss))
    else:
        ...
```

`loss` is the input, `losses` is its flattened form, and `x` iterates over `losses`, yet `scope.add_loss` and `self._loss_ids.add` repeatedly add `loss` rather than `x`. Wouldn't this be wrong when `loss` is a list of scalar tensors?
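A minimal standalone sketch of the suspected behavior, assuming the flattening semantics of `tree.flatten`; `Scope` and the crude flatten step here are simplified stand-ins, not the real Keras classes:

```python
# `Scope` is a hypothetical stand-in for Keras's stateless scope.
class Scope:
    def __init__(self):
        self.losses = []

    def add_loss(self, value):
        self.losses.append(value)

def buggy_add_loss(scope, loss_ids, loss):
    # Crude stand-in for tree.flatten(loss).
    losses = loss if isinstance(loss, (list, tuple)) else [loss]
    for x in losses:
        scope.add_loss(loss)    # bug: appends the whole input every iteration
        loss_ids.add(id(loss))  # bug: records the same id, len(losses) times

def fixed_add_loss(scope, loss_ids, loss):
    losses = loss if isinstance(loss, (list, tuple)) else [loss]
    for x in losses:
        scope.add_loss(x)       # fix: append each flattened element
        loss_ids.add(id(x))

scope_a, ids_a = Scope(), set()
buggy_add_loss(scope_a, ids_a, [0.1, 0.2])
print(scope_a.losses)  # [[0.1, 0.2], [0.1, 0.2]] -- the list itself, twice
print(len(ids_a))      # 1

scope_b, ids_b = Scope(), set()
fixed_add_loss(scope_b, ids_b, [0.1, 0.2])
print(scope_b.losses)  # [0.1, 0.2]
print(len(ids_b))      # 2
```

With a list of scalar tensors, the buggy loop collects the whole list once per element instead of each element once, and tracks only one id.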

The unit tests also don't cover the case where `loss` is a list of scalar tensors: https://github.com/keras-team/keras/blob/master/keras/src/layers/layer_test.py#L540-L567
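A test for that case might look like the following sketch. `FakeLayer` is a hypothetical stdlib-only stand-in that models the *intended* semantics (each flattened element tracked individually); a real test would exercise `keras.layers.Layer` in `layer_test.py`:

```python
# Hypothetical stand-in, not the real keras.layers.Layer.
class FakeLayer:
    def __init__(self):
        self.losses = []
        self._loss_ids = set()

    def add_loss(self, loss):
        # Crude stand-in for tree.flatten(loss).
        losses = loss if isinstance(loss, (list, tuple)) else [loss]
        for x in losses:
            self.losses.append(x)
            self._loss_ids.add(id(x))

def test_add_loss_with_list_of_scalars():
    layer = FakeLayer()
    layer.add_loss([1.0, 2.0, 3.0])
    # Each scalar should be tracked individually, not the list itself.
    assert layer.losses == [1.0, 2.0, 3.0]
    assert len(layer._loss_ids) == 3

test_add_loss_with_list_of_scalars()
```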

@franklin5
Contributor Author

Thanks, Francois. #21291 is merged.
