Export rec-quant-model (svtr base) inference? #7560
centurions started this conversation in General
I get the error below when I try to export an inference model from a quantization-aware trained model (rec SVTR base model).

paddleslim == 2.3.4
paddleocr == 2.6
paddlepaddle-gpu == 2.3.2.post111
Traceback (most recent call last):
  File "deploy/slim/quantization/export_model.py", line 169, in <module>
    main()
  File "deploy/slim/quantization/export_model.py", line 164, in main
    export_single_model(model, arch_config, save_path, logger, quanter)
  File "E:\new\PaddleOCR-release-2.6\tools\export_model.py", line 74, in export_single_model
    shape=[None] + input_shape, dtype="float32"),
TypeError: can only concatenate list (not "QAT") to list
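For context, the traceback suggests an argument mismatch between the two scripts: the slim export script passes the QAT quanter positionally, and it lands in the parameter that tools/export_model.py concatenates as an input-shape list. The snippet below is a minimal, self-contained sketch of that mismatch with stand-in names; the export_single_model signature shown here is an assumption for illustration, not the actual PaddleOCR code.

```python
# Minimal sketch of the suspected mismatch. The export_single_model signature
# below is an assumption for illustration, not the real tools/export_model.py.

class QAT:
    """Stand-in for paddleslim's QAT quanter object."""


def export_single_model(model, arch_config, save_path, logger,
                        input_shape=None, quanter=None):
    # tools/export_model.py builds an InputSpec from an input-shape list:
    shape_list = input_shape if input_shape is not None else [3, 48, 320]
    return [None] + shape_list  # TypeError if a QAT object lands here


quanter = QAT()

# The slim script in the traceback passes the quanter as the fifth positional
# argument, so it binds to input_shape and breaks the list concatenation:
try:
    export_single_model("model", {"algorithm": "SVTR"}, "./inference", None, quanter)
except TypeError as err:
    print(err)  # -> can only concatenate list (not "QAT") to list

# Passing the quanter by keyword avoids the collision, which is presumably
# what matched, up-to-date versions of the two scripts do:
print(export_single_model("model", {"algorithm": "SVTR"}, "./inference", None,
                          quanter=quanter))  # -> [None, 3, 48, 320]
```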
Replies: 1 comment

Please update to the latest code; it seems that the code you are using is old. The function …
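In other words, once deploy/slim/quantization/export_model.py and tools/export_model.py come from the same release, the quantized recognizer can be re-exported with the usual -c/-o invocation from the slim quantization docs. The config and paths below are placeholders; adjust them to your setup.

```
python deploy/slim/quantization/export_model.py \
    -c <your_rec_svtr_config>.yml \
    -o Global.checkpoints=<path_to_quant_trained_model> \
       Global.save_inference_dir=./inference/rec_svtr_quant
```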