-
Hi, I converted the model to TFLite and also tried converting it for Google Coral, but the issue remains.
Thank you very much.
-
Hello! Thank you for getting in touch with us. ARTPEC-8 cameras only support TensorFlow 1. There is a guide on how to train and convert your model from TensorFlow 1 to Larod in the tensorflow-to-larod-artpec8 example; specifically, the model conversion is done in convert_model.py.
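For reference, a minimal sketch of what a TensorFlow 1.x TFLite conversion looks like in the Python API. The paths below are placeholders, and the authoritative steps for ARTPEC-8 are the ones in convert_model.py, which may differ in detail:

```python
# Minimal sketch of a TF1 -> TFLite conversion, assuming a SavedModel export.
# "saved_model_dir" and "model.tflite" are placeholder paths; the actual
# conversion for ARTPEC-8 is the one done in convert_model.py.
import tensorflow as tf  # TensorFlow 1.15

converter = tf.lite.TFLiteConverter.from_saved_model("saved_model_dir")
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```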
-
Thank you very much. I'm trying to convert custom models trained with the Object Detection API (TensorFlow 1.15), but I need to pass additional flags to the converter, such as --allow_custom_ops, for the conversion to work, and when the model is loaded on the camera I get the same error (Could not load model: Could not build an interpreter of the model). EDIT: actually, the only flag I need is --allow_custom_ops, because of the TFLite_Detection_PostProcess operator; otherwise no extra flag is needed, so I suppose that shouldn't be an issue.
-
Hi, unfortunately custom ops are not supported at the moment, so you can't use that model directly on the camera.
-
Hi, thank you very much for the answer. The output of the example is a list of several TFLite_Detection_PostProcess values (line 43 of https://github.com/AxisCommunications/acap-computer-vision-sdk-examples/blob/main/object-detector-python/app/detector.py), so I suppose the custom op must have been handled somehow when that model was converted to TFLite. Best regards
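For anyone following along, here is a minimal sketch (not the example's actual code) of how those four TFLite_Detection_PostProcess outputs are typically laid out when the .tflite file is loaded directly with the TFLite interpreter on a desktop. As far as I can tell, the linked detector.py gets equivalent tensors back from the inference service rather than running the interpreter itself; the model path and the output ordering below are assumptions:

```python
# Minimal sketch: inspecting the four TFLite_Detection_PostProcess outputs of
# an SSD-style .tflite model on a desktop machine. "model.tflite" is a
# placeholder path; the output ordering shown is the usual convention
# (boxes, classes, scores, count) -- verify against get_output_details()
# for your own model.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy frame with the model's expected input shape and dtype, e.g. [1, 300, 300, 3] uint8.
frame = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], frame)
interpreter.invoke()

boxes   = interpreter.get_tensor(output_details[0]["index"])  # [1, N, 4] ymin, xmin, ymax, xmax (normalized)
classes = interpreter.get_tensor(output_details[1]["index"])  # [1, N] class indices
scores  = interpreter.get_tensor(output_details[2]["index"])  # [1, N] confidences
count   = interpreter.get_tensor(output_details[3]["index"])  # [1] number of valid detections
print(boxes.shape, classes.shape, scores.shape, count)
```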
-
Alright, I see, so you are using a standard model (ssd_mobilenetv2) on a custom dataset? That should work. Did you implement the model yourself? If your model needs --allow_custom_ops during conversion, that is in general a sign that it won't work on the hardware, because custom ops are not allowed at the moment. We also have this guide. Of course, you can also implement your own model, but you need to make sure not to use custom ops.
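A quick way to double-check a converted file for custom ops before copying it to the camera is to run TFLite's model analyzer on a development machine. This sketch assumes TensorFlow 2.7+ is available there (separate from the TF1 training environment), and the path is a placeholder:

```python
# Minimal sketch: list the ops inside a .tflite file. Custom ops show up by
# name in the printed report (e.g. TFLite_Detection_PostProcess), while
# builtin ops appear as builtin operator names. Requires TensorFlow 2.7+.
import tensorflow as tf

tf.lite.experimental.Analyzer.analyze(model_path="model.tflite")  # placeholder path
```

If any custom op is listed, that matches the "Could not build an interpreter of the model" failure described above.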
-
Many thanks for the detailed answer. To convert the model I use the export_tflite_ssd_graph.py script and then the tflite_convert script, which fails if --allow_custom_ops is not set.
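Concretely, this is roughly the same step done through the TF1 Python API instead of the tflite_convert CLI, which shows that --allow_custom_ops is only needed because of the postprocessing op. The tensor names are the ones export_tflite_ssd_graph.py conventionally produces; the paths, the 300x300 input size and the quantization stats are assumptions for an SSD MobileNet-style model:

```python
# Sketch of the tflite_convert step in the TF1 Python API, for the graph that
# export_tflite_ssd_graph.py writes (tflite_graph.pb). Paths, the 300x300
# input size and the quantization stats below are assumptions; adjust them
# to your own model.
import tensorflow as tf  # TensorFlow 1.15

converter = tf.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file="tflite_graph.pb",
    input_arrays=["normalized_input_image_tensor"],
    output_arrays=[
        "TFLite_Detection_PostProcess",
        "TFLite_Detection_PostProcess:1",
        "TFLite_Detection_PostProcess:2",
        "TFLite_Detection_PostProcess:3",
    ],
    input_shapes={"normalized_input_image_tensor": [1, 300, 300, 3]},
)

# Only needed for quantized TF1 model zoo checkpoints.
converter.inference_type = tf.uint8
converter.quantized_input_stats = {"normalized_input_image_tensor": (128.0, 128.0)}

# Required solely because TFLite_Detection_PostProcess is a custom op -- the
# same op that then prevents the model from loading on the camera.
converter.allow_custom_ops = True

tflite_model = converter.convert()
with open("detect.tflite", "wb") as f:
    f.write(tflite_model)
```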
-
Hi, thanks for the suggestion. I've tried this model, but the custom-ops flag is still needed.
-
I used ssd_mobilenet_v2_quantized_coco from the TensorFlow 1 model zoo. Now I'm trying other models to see how fast inference runs while keeping the best possible model.