
splitting lib into train and inference (ONNX) #530

Answered by fg-mindee
felixdittrich92 asked this question in Q&A

Hi @felixdittrich92

Thanks for the suggestion! ONNX is indeed on our radar since #5 😅
However, while ONNX should be one possible way to run inference, it should not close the door to others (OpenVINO, etc.).

That being said, the general idea is also something we were planning to do. But it first needs in-depth discussion on a few points:

  • how constraining will "being able to export to ONNX" be? (considering we would need to make modifications to existing models; see the sketch after this comment)
  • how can we leave the door open for other IRs / inference engines?
  • Fully agree on having a default setup for inference, and making training an extra 👍

Looking forward to this discussion!
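
To make the train/inference split concrete, here is a minimal sketch of the workflow under discussion: train in PyTorch, export the graph to ONNX once, then serve it with onnxruntime alone. The `TinyDetector` module below is a hypothetical stand-in, not doctr's actual API; as the first bullet above notes, real models may need modifications (e.g. returning plain tensors rather than dicts) before they export cleanly.

```python
import torch
import torch.nn as nn
import onnxruntime as ort


class TinyDetector(nn.Module):
    """Hypothetical stand-in for a text-detection model."""

    def __init__(self) -> None:
        super().__init__()
        self.conv = nn.Conv2d(3, 8, kernel_size=3, padding=1)
        self.head = nn.Conv2d(8, 1, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Plain tensor output: dict outputs are a common obstacle to ONNX export
        return torch.sigmoid(self.head(torch.relu(self.conv(x))))


model = TinyDetector().eval()
dummy = torch.rand(1, 3, 256, 256)

# One-time export on the training side; dynamic axes keep batch size
# and image size flexible at inference time
torch.onnx.export(
    model,
    dummy,
    "detector.onnx",
    input_names=["input"],
    output_names=["prob_map"],
    dynamic_axes={
        "input": {0: "batch", 2: "height", 3: "width"},
        "prob_map": {0: "batch", 2: "height", 3: "width"},
    },
    opset_version=13,
)

# Inference side: only onnxruntime is required, no training framework
sess = ort.InferenceSession("detector.onnx", providers=["CPUExecutionProvider"])
(prob_map,) = sess.run(None, {"input": dummy.numpy()})
print(prob_map.shape)  # (1, 1, 256, 256)
```

A nice property of this split is that the inference side is engine-agnostic at the file level: the same `detector.onnx` artifact could also be consumed by other runtimes such as OpenVINO, which speaks to the "leave the door open" point above.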

Category: Q&A
Labels: type: enhancement, help wanted, module: models, topic: onnx
2 participants

This discussion was converted from issue #518 on October 14, 2021 16:27.