🐛 [Bug] AssertionError: end must be an integer #3448
Comments
@apbose please look at this bug.
Thanks for the issue. Trying to repro the above.
@apbose Thank you for the reply. subsequent_mask() looks like this:
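The snippet itself did not survive the copy here. A minimal sketch of a typical Transformer-style `subsequent_mask()` helper follows; this is an assumption based on the common pattern, not necessarily the reporter's exact code:

```python
import torch

def subsequent_mask(size: int) -> torch.Tensor:
    # Causal (lower-triangular) mask: position i may attend to positions <= i.
    attn_shape = (1, size, size)
    upper = torch.triu(torch.ones(attn_shape, dtype=torch.uint8), diagonal=1)
    return upper == 0
```

When traced with a dynamic sequence length, `size` arrives as a SymInt rather than a plain Python int, which is the kind of dynamic case the lowering pass is described as failing on below.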
The whole repo of my project is customized and hard to summarize.
There are a couple of other things missing for the repro. The opt in
@apbose
Hmm, I would need the code to repro the error and see what is going on. It looks like the lowering pass is not able to handle a dynamic case.
@apbose
abose@nvidia.com. You could share it there, or point to it here.
@apbose |
I cannot find it. Could you please let me know the email address from which you mailed?
dusrb2003@naver.com |
Thanks, received. I will take a look.
*Fulitcher* left a comment (pytorch/TensorRT#3448): just sent again.
@Fulitcher I see this error. Please note that I am skipping the lines. Can that have an effect?
@apbose
Ok, got it. It looks like I am getting a different error, where the number of outputs differs from the number of output dtypes. Need to look into this further.
Yep, thanks for taking a look.
So when I run your model, I see that there is a mismatch between the number of outputs and the dtypes, since there are SymInts and SymInt-dependent ops appearing in the output, for which dtypes are not allocated.
Ok, you have mentioned it above. Let me try with the above versions.
I could repro with the above versions. Working on a fix.
Bug Description
To Reproduce
Steps to reproduce the behavior:
Expected behavior
The "rt_model.ep" model file must be created and saved.
Environment
How you installed PyTorch (conda, pip, libtorch, source): Docker image "PyTorch Release 25.02" at link
Additional context
Even if I add or remove calls such as clone() and detach() on the 'probs' variable in the example code where the error occurred, the same error occurs.
!!! important !!!
Model conversion succeeds only when a static batch size is set for the inputs, as below.
Setting a dynamic batch size for the inputs results in the error described above.
Is there a way to take a dynamic batch size as input, as in the provided sample code?