Description
Hi, after successfully converting the two ONNX models with the conversion code provided by the project, I wanted to use TensorRT for acceleration. During the ONNX->engine step, the encoder model converts to an engine without problems, but the decoder model fails with the following errors:
[02/15/2025-15:37:40] [E] Error[4]: [graph.cpp::nvinfer1::builder::Node::symbolicExecute::535] Error Code 4: Internal Error (/OneHot: an IIOneHotLayer cannot be used to compute a shape tensor)
[02/15/2025-15:37:40] [E] [TRT] ModelImporter.cpp:768: While parsing node number 146 [Tile -> "/Tile_output_0"]:
[02/15/2025-15:37:40] [E] [TRT] ModelImporter.cpp:769: --- Begin node ---
[02/15/2025-15:37:40] [E] [TRT] ModelImporter.cpp:770: input: "/Unsqueeze_11_output_0"
input: "/Reshape_4_output_0"
output: "/Tile_output_0"
name: "/Tile"
op_type: "Tile"
[02/15/2025-15:37:40] [E] [TRT] ModelImporter.cpp:771: --- End node ---
[02/15/2025-15:37:40] [E] [TRT] ModelImporter.cpp:774: ERROR: ModelImporter.cpp:195 In function parseGraph:
[6] Invalid Node - /Tile
[graph.cpp::nvinfer1::builder::Node::symbolicExecute::535] Error Code 4: Internal Error (/OneHot: an IIOneHotLayer cannot be used to compute a shape tensor)
[02/15/2025-15:37:40] [E] Failed to parse onnx file
[02/15/2025-15:37:40] [I] Finished parsing network model. Parse time: 0.0610398
[02/15/2025-15:37:40] [E] Parsing model failed
[02/15/2025-15:37:40] [E] Failed to create engine from model or file.
[02/15/2025-15:37:40] [E] Engine set up failed
&&&& FAILED TensorRT.trtexec [TensorRT v8600] # trtexec.exe --onnx=./sam2.1_tiny.onnx --saveEngine=./sam2.1_tiny.engine
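For reference, the /OneHot and /Tile nodes named in the log can be located in the exported graph with a short script like the one below. This is only a minimal diagnostic sketch, assuming the decoder export is the sam2.1_tiny.onnx file passed to trtexec above; it just prints the inputs and outputs of the nodes involved.

```python
# Hypothetical diagnostic sketch (not part of the project's code):
# list the OneHot and Tile nodes in the decoder ONNX graph, since TensorRT
# reports that /OneHot feeds the shape input of /Tile and cannot be used
# to compute a shape tensor.
import onnx

model = onnx.load("./sam2.1_tiny.onnx")  # assumed decoder export path
graph = model.graph

for node in graph.node:
    if node.op_type in ("OneHot", "Tile"):
        print(node.op_type, node.name)
        print("  inputs :", list(node.input))
        print("  outputs:", list(node.output))
```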