Stop wrapping optimizers to simplify use of torch optimizers #18

@havakv

Description

Right now all optimizers are wrapped, so to access a torch optimizer object we need to call model.optimizer.optimizer. It might make sense to be able to get the torch optimizer with model.optimizer.

If we continue to wrap torch optimizers, maybe name the wrapper object explicitly, e.g. model.optimizer_wrapper.
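A minimal sketch of the proposed layout, assuming we keep the wrapper. The class names (Model, OptimizerWrapper) are illustrative only, not the library's actual classes, and a stand-in SGD class replaces torch.optim.SGD so the example is self-contained:

```python
class SGD:
    """Stand-in for torch.optim.SGD, just for illustration."""
    def step(self):
        pass


class OptimizerWrapper:
    """Wraps the torch optimizer; holds any library-specific extras."""
    def __init__(self, optimizer):
        self.optimizer = optimizer  # the underlying torch optimizer


class Model:
    def __init__(self, optimizer):
        # The wrapper lives under an explicit attribute name ...
        self.optimizer_wrapper = OptimizerWrapper(optimizer)

    @property
    def optimizer(self):
        # ... so model.optimizer returns the torch optimizer directly,
        # instead of requiring model.optimizer.optimizer as today.
        return self.optimizer_wrapper.optimizer


sgd = SGD()
model = Model(sgd)
assert model.optimizer is sgd
```

This keeps the wrapper reachable for internal use while making the common case (grabbing the torch optimizer) a single attribute access.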

Labels: enhancement (New feature or request)
