
Pruning ONNX inputs #717

@blester125

Description


When a model takes an input tensor that it never actually uses, like lengths, that input often gets stripped out during ONNX export (by onnx, or maybe pytorch?). For example, a Classifier built on the LSTM needs the lengths, so lengths will show up in the graph's inputs. With something like a Conv Net classifier the lengths are never used, so the input gets stripped out. This means that if you send a lengths tensor anyway you will get an error. The sketch below reproduces the pruning.
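A minimal, self-contained repro of the pruning under tracing-based export; the model, file name, and tensor names here are made up for illustration, not from the actual baseline code:

```python
import torch
import torch.nn as nn

class ConvClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv1d(8, 4, kernel_size=3, padding=1)
        self.proj = nn.Linear(4, 2)

    def forward(self, embedded, lengths):
        # lengths is in the signature but never touched, so tracing
        # records no dependence on it
        pooled = self.conv(embedded).max(dim=2).values
        return self.proj(pooled)

model = ConvClassifier().eval()
embedded = torch.randn(1, 8, 100)
lengths = torch.tensor([100])

torch.onnx.export(
    model, (embedded, lengths), "classifier.onnx",
    input_names=["embedded", "lengths"], output_names=["logits"],
)

import onnx
graph = onnx.load("classifier.onnx").graph
# expect only ["embedded"] here: the unused lengths input gets pruned
print([i.name for i in graph.input])
```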

We normally decide what to send based on the model.assets file, so we should be able to filter the inputs based on ort.InferenceSession(...).get_inputs() and it should all work out. In the ONNX service we might need to use this method to filter too; I'm not sure the ONNX service ever checks the model.assets file.
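A minimal sketch of that filtering, assuming the feed dict has already been built from the model.assets file; the file name and tensor shapes are placeholders matching the export example above, but InferenceSession.get_inputs() is the real onnxruntime API:

```python
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("classifier.onnx", providers=["CPUExecutionProvider"])

# Everything the vectorizers produced (driven by model.assets in our case).
example = {
    "embedded": np.random.randn(1, 8, 100).astype(np.float32),
    "lengths": np.array([100], dtype=np.int64),
}

# Keep only tensors the exported graph actually declares as inputs, so a
# pruned input like `lengths` never reaches session.run.
valid = {inp.name for inp in session.get_inputs()}
feed = {name: value for name, value in example.items() if name in valid}

outputs = session.run(None, feed)
```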
