
Improve TorchScript model identification logic in torch_model.py #546


Merged
merged 1 commit into main on Jul 22, 2025

Conversation

@gigony (Collaborator) commented Jul 21, 2025

  • Updated copyright notice to reflect the years 2021-2025.
  • Enhanced the logic for identifying TorchScript models by clarifying the required files and directories in the zip archive format (a sketch of such a check follows below).

Address #544

Signed-off-by: Gigon Bae <gbae@nvidia.com>
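
For illustration only, a check along these lines might look like the minimal sketch below. The function name is_torchscript_model and the exact set of required entries are assumptions for this sketch, not necessarily what torch_model.py implements; the constants/-directory alternative reflects the reviewer's note further down about TRT-exported MONAI Bundle models.

```python
import zipfile
from pathlib import Path


def is_torchscript_model(model_path: str) -> bool:
    """Heuristic check for a TorchScript model saved as a zip archive.

    Assumed required entries (under the archive's top-level folder):
    data.pkl, a code/ directory, and constants stored either as
    constants.pkl (as written by torch.jit.save) or as a constants/
    directory (as seen in some exported bundles).
    """
    path = Path(model_path)
    if not path.is_file() or not zipfile.is_zipfile(path):
        return False

    with zipfile.ZipFile(path) as zf:
        names = zf.namelist()

    # Entry names are prefixed with the archive's internal folder name,
    # so match on suffixes/substrings rather than exact paths.
    has_data_pkl = any(n.endswith("/data.pkl") for n in names)
    has_code_dir = any("/code/" in n for n in names)
    has_constants = any(
        n.endswith("/constants.pkl") or "/constants/" in n for n in names
    )
    return has_data_pkl and has_code_dir and has_constants
```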
@gigony gigony requested a review from MMelQin July 21, 2025 21:21
@gigony gigony self-assigned this Jul 21, 2025
@gigony gigony added the bug (Something isn't working) and enhancement (New feature or request) labels Jul 21, 2025
@MMelQin (Collaborator) left a comment


Thanks @gigony for addressing this!

Edit after the merge: TorchScript itself does not save constants in a constants folder in the archive (see here); the model file mentioned in the issue was exported with the --trt_export flag, so it is the MONAI Bundle that added a constants folder instead of constants.pkl for the TRT-optimized model.
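
A quick way to see which of the two layouts a given archive uses is to list its entries, as in the sketch below; the file name model_trt.ts is a placeholder, not a file referenced in this PR.

```python
import zipfile

# Placeholder path for a TRT-exported MONAI Bundle TorchScript file.
with zipfile.ZipFile("model_trt.ts") as zf:
    names = zf.namelist()

print("constants.pkl present:", any(n.endswith("/constants.pkl") for n in names))
print("constants/ dir present:", any("/constants/" in n for n in names))
```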

@gigony gigony merged commit dec9305 into main Jul 22, 2025
5 checks passed