Replies: 3 comments 4 replies
-
I tried many code snippets, but nothing works.
-
Hi, I found some cues on the detectron2 website that might be helpful for my export.
My inputs are the same as the one above, so it can work with this code:
Thus, I rewrote part of the code in my own way.
The problem is in opset_version. Besides, another cue is in detectron2/tools/deploy/export_model.py, line 40.
I tried the code above, but it still shows the error.
-
Not succeeded yet, but I think it is close. I followed the instructions in
However, it shows an error in torch.onnx.export.
-
The problem is in my model: it takes a ViT as the backbone and GeneralizedRCNN from detectron2 for object detection. It uses
List[dict_1{"image": , "height": , "width": }, dict_2{"image": , "height": , "width": } ...]
as input, and I can get results from the model; the output is also a List: [{"instances": Instances()}, {"instances": Instances()} ...]
First, I tried torch.onnx.export to export to ONNX, but when I pass inputs in the same format, it only accepts dict_1{} and shows an error. Thus, I followed a tutorial in detectron2 using export_onnx_model.
The error shows:
RuntimeError: [enforce fail at context_gpu.cu:240] . out of memory
I don't know whether the problem is my GPU capacity or my input format, and is it possible to use detectron2 with this input format? P.S.
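For what it's worth, the List[dict] signature is exactly what the tracer behind torch.onnx.export cannot digest: tracing wants tensors (or tuples of tensors) as arguments. A common workaround is a thin wrapper module that takes a plain batched tensor and rebuilds the dicts inside forward. The sketch below uses a toy stand-in model (DictInputModel and its "scores" output are hypothetical, just to mimic the List[dict]-in / List[dict]-out interface); with a real GeneralizedRCNN you would also need to flatten the Instances outputs into tensors, which is what detectron2's TracingAdapter in detectron2.export is for.

```python
import torch
import torch.nn as nn

# Toy stand-in for a detectron2-style model (hypothetical, for illustration):
# it consumes List[{"image": Tensor, "height": int, "width": int}] like
# GeneralizedRCNN does, and returns a list of per-image result dicts.
class DictInputModel(nn.Module):
    def forward(self, batched_inputs):
        return [{"scores": d["image"].mean(dim=0)} for d in batched_inputs]

# Wrapper exposing a plain-tensor signature that the tracer / ONNX exporter
# can handle; the dict plumbing happens inside forward.
class ExportWrapper(nn.Module):
    def __init__(self, model):
        super().__init__()
        self.model = model

    def forward(self, images):  # images: (N, C, H, W)
        batched = [
            {"image": img, "height": img.shape[1], "width": img.shape[2]}
            for img in images
        ]
        outputs = self.model(batched)
        # Return stacked tensors, not dicts, so ONNX sees tensor outputs.
        return torch.stack([o["scores"] for o in outputs])

model = DictInputModel().eval()
wrapped = ExportWrapper(model).eval()
images = torch.randn(2, 3, 16, 16)
result = wrapped(images)  # one (H, W) score map per image, stacked
```

You would then pass `wrapped` and the plain `images` tensor to torch.onnx.export instead of the dict-taking model. As for the out-of-memory from context_gpu.cu: that error comes from the Caffe2 side of export_onnx_model, so one cheap check is to run the export with the model and inputs on CPU first, which separates a GPU-capacity problem from an input-format problem.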