Description
When a model has an input tensor like `lengths` that it never actually uses, that input often gets stripped out by ONNX (or PyTorch?) during export. For example, a classifier built on an LSTM needs `lengths`, so it stays in the inputs. But with something like a ConvNet classifier, where the length is never used, the input gets stripped out. This means that if you send a `lengths` tensor to the exported model, you will get an error.
We normally decide what to send based on the model.assets file, so we should instead filter the inputs based on `ort.InferenceSession(...).get_inputs()`, and it should work out. In the ONNX service we might need to use this method to do the filtering; I'm not sure the ONNX service ever checks the model.assets file.
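A minimal sketch of that filtering idea, assuming an onnxruntime-style `InferenceSession` (anything exposing `get_inputs()` with `.name` attributes works); `filter_feed` is a hypothetical helper name, not an existing API:

```python
def filter_feed(session, feed):
    # get_inputs() lists the inputs the exported graph actually declares
    # (e.g. on an onnxruntime InferenceSession). Any tensor the export
    # stripped out, like an unused `lengths`, won't appear here, so we
    # drop it from the feed instead of triggering an invalid-input error.
    valid = {inp.name for inp in session.get_inputs()}
    return {name: value for name, value in feed.items() if name in valid}
```

Then the service can always call `session.run(None, filter_feed(session, feed))` without caring whether this particular export kept `lengths`.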