The FINN build process fails with the following error when building the dummy transformer:
Traceback (most recent call last):
  File "/dev/shm/temporary_finn_dir/finn/src/finn/builder/build_dataflow.py", line 158, in build_dataflow_cfg
    model = transform_step(model, cfg)
  File "/scratch/hpc-prf-ekiapp/linusjun/test/finn-on-n2/attention/build_steps.py", line 731, in step_apply_folding_config
    verify_step(model, cfg, "folded_hls_cppsim", need_parent=True)
  File "/dev/shm/temporary_finn_dir/finn/src/finn/builder/build_dataflow_steps.py", line 166, in verify_step
    out_dict = execute_parent(parent_model_fn, child_model_fn, in_npy, return_full_ctx=True)
  File "/dev/shm/temporary_finn_dir/finn/src/finn/util/test.py", line 171, in execute_parent
    ret = execute_onnx(parent_model, {iname: input_tensor_npy}, True)
  File "/dev/shm/temporary_finn_dir/finn/src/finn/core/onnx_exec.py", line 54, in execute_onnx
    return execute_onnx_base(model, input_dict, return_full_exec_context, start_node, end_node)
  File "/dev/shm/temporary_finn_dir/finn/deps/qonnx/src/qonnx/core/onnx_exec.py", line 179, in execute_onnx
    execute_node(node, execution_context, graph, return_full_exec_context, opset_version)
  File "/dev/shm/temporary_finn_dir/finn/deps/qonnx/src/qonnx/core/onnx_exec.py", line 87, in execute_node
    sess = rt.InferenceSession(node_model.SerializeToString())
  File "/usr/local/lib/python3.10/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 419, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/usr/local/lib/python3.10/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 454, in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.InvalidGraph: [ONNXRuntimeError] : 10 : INVALID_GRAPH : This is an invalid model. In Node, ("Squeeze_0", Squeeze, "", -1) : ("global_in": tensor(float),"Squeeze_0_param0": tensor(int64),) -> ("Squeeze_0_out0": tensor(float),) , Error Unrecognized attribute: axes for operator Squeeze
> /usr/local/lib/python3.10/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py(454)_create_inference_session()
-> sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
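Background on the error message: from ONNX opset 13 onwards, Squeeze takes its axes as an optional int64 input rather than a node attribute, and onnxruntime rejects any leftover axes attribute on such graphs. The failing node already carries an int64 input (Squeeze_0_param0), so the stale attribute is the likely culprit. Below is a minimal, hedged diagnostic/patch sketch (not the FINN-level fix; the model path and output name are placeholders) that moves or removes the attribute so the single-node model produced by QONNX execution can be loaded:

# Hedged sketch only: inspect Squeeze nodes and drop a stale "axes" attribute,
# adding an int64 initializer input if one is missing (opset >= 13 layout).
# "intermediate_model.onnx" is a placeholder path, not a file from this issue.
import onnx
from onnx import checker, helper

model = onnx.load("intermediate_model.onnx")  # placeholder path

for node in model.graph.node:
    if node.op_type != "Squeeze":
        continue
    axes_attrs = [a for a in node.attribute if a.name == "axes"]
    if not axes_attrs:
        continue
    axes = list(axes_attrs[0].ints)
    # Remove the attribute that onnxruntime rejects for opset >= 13.
    node.attribute.remove(axes_attrs[0])
    # If no axes input exists yet, supply one as an int64 initializer.
    if len(node.input) < 2:
        axes_name = node.name + "_axes"
        model.graph.initializer.append(
            helper.make_tensor(axes_name, onnx.TensorProto.INT64, [len(axes)], axes)
        )
        node.input.append(axes_name)

checker.check_model(model)
onnx.save(model, "intermediate_model_patched.onnx")  # placeholder output path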
Workaround:
- Use step_convert_attention_to_hw from https://github.com/iksnagreb/radioml-transformer/blob/conformer/build_steps.py#L282
- Remove step_tidy_up_post_attention from build.py (a sketch of the adjusted steps list follows below)
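A minimal sketch of how the workaround could be wired in, assuming build.py passes an explicit steps list to FINN's DataflowBuildConfig; apart from the two steps named above, all step names, file names, and config values are placeholders:

# Hedged sketch only: the real build.py and its step list are not shown in this
# issue; standard step names and config values here are placeholders.
import finn.builder.build_dataflow as build
import finn.builder.build_dataflow_config as build_cfg

# Custom attention steps, assumed importable from the linked build_steps.py
from build_steps import step_convert_attention_to_hw, step_apply_folding_config

steps = [
    "step_qonnx_to_finn",              # placeholder standard steps
    "step_tidy_up",
    step_convert_attention_to_hw,      # workaround: use the conformer-branch step
    # step_tidy_up_post_attention removed per the workaround
    step_apply_folding_config,
    "step_hw_codegen",
]

cfg = build_cfg.DataflowBuildConfig(
    output_dir="build_output",           # placeholder
    synth_clk_period_ns=10.0,            # placeholder
    fpga_part="xczu7ev-ffvc1156-2-e",    # placeholder part
    steps=steps,
    generate_outputs=[build_cfg.DataflowOutputType.STITCHED_IP],
)

build.build_dataflow_cfg("dummy_transformer.onnx", cfg)  # placeholder model file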