Error loading custom detection model after conversion #7114
Unanswered
Zenmaster188
asked this question in
Q&A
Replies: 0 comments
I trained the Paddle detection model on my own custom data, and it produced the best_accuracy and latest checkpoints.
When I tried to export it and convert it to the required inference format using the command:
python3 tools/export_model.py -c configs/det/det_r50_vd_db.yml -o Global.pretrained_model="./output/det_r50_vd/latest" Global.save_inference_dir="./output/det_db_inference/"
it gave this output:
W0804 12:55:34.817917 4102 gpu_resources.cc:61] Please NOTE: device: 0, GPU Compute Capability: 6.0, Driver API Version: 11.0, Runtime API
Version: 10.2
W0804 12:55:34.822103 4102 gpu_resources.cc:91] device: 0, cuDNN Version: 7.6.
[2022/08/04 12:55:35] ppocr INFO: load pretrain successful from ./output/det_r50_vd/best_accuracy
[2022/08/04 12:55:38] ppocr INFO: inference model is saved to ./output/det_db_inference/inference
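One way to check whether the checkpoint itself already contains NaN/Inf values would be to scan its tensors before exporting; a minimal sketch using a dummy numpy dict in place of the real checkpoint (the parameter names here are made up; I believe paddle.load on the .pdparams file would give the real dict):

```python
import numpy as np

# Dummy stand-in for the checkpoint's parameter dict; with paddle installed,
# something like paddle.load("./output/det_r50_vd/best_accuracy.pdparams")
# should yield the real name -> tensor mapping.
params = {
    "conv1.weight": np.zeros((8, 3, 3, 3)),                # healthy tensor
    "batch_norm_55.w_2": np.array([0.1, np.inf, 0.3]),     # variance with a bad entry
}

# Collect every parameter that contains at least one NaN or Inf value.
bad = [name for name, t in params.items() if not np.isfinite(np.asarray(t)).all()]
print(bad)  # ['batch_norm_55.w_2']
```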
When I loaded the model with the command below:
python3 tools/infer/predict_det.py --det_algorithm="DB" --det_model_dir="./output/det_db_inference/" --image_dir="../image" --use_gpu=True
it failed with:
Traceback (most recent call last):
File "tools/infer/predict_det.py", line 262, in <module>
    text_detector = TextDetector(args)
File "tools/infer/predict_det.py", line 121, in __init__
    args, 'det', logger)
File "/home/user/paddle/PaddleOCR/tools/infer/utility.py", line 317, in create_predictor
predictor = inference.create_predictor(config)
ValueError: (InvalidArgument) The inverse of Fused batch norm variance should be finite. Found nonfinite values! Please check batch_norm_55.w_2
[Hint: Expected std::isfinite(variance_array[i]) == true, but received std::isfinite(variance_array[i]):0 != true:1.] (at /paddle/paddle/fluid/framework/ir/conv_bn_fuse_pass.cc:105)
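If I read the error right, conv_bn_fuse_pass folds the batch norm into the preceding conv using the BN running variance, and it rejects the model because one variance tensor (batch_norm_55.w_2) is nonfinite. A minimal numpy sketch of that folding arithmetic (my own reconstruction, not Paddle's actual code):

```python
import numpy as np

# Conv-BN folding computes roughly fused_w = w * gamma / sqrt(var + eps),
# so a single NaN/Inf entry in the running variance makes the fused conv
# weights for that output channel nonfinite as well.
eps = 1e-5
w = np.ones((4, 3, 3, 3))                 # conv weights, one filter per out channel
gamma = np.ones(4)                        # BN scale
var = np.array([1.0, 2.0, np.nan, 4.0])  # running variance with one bad entry

inv_std = 1.0 / np.sqrt(var + eps)        # the quantity the pass checks for finiteness
fused_w = w * (gamma * inv_std)[:, None, None, None]

print(np.isfinite(inv_std))               # third entry is False, the check that fails
```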
I am using:
PaddlePaddle 2.3.0, compiled with:
with_avx: ON
with_gpu: ON
with_mkl: ON
with_mkldnn: ON
with_python: ON