1 parent f231147 commit 47e71e0
README.md
@@ -803,6 +803,9 @@ You can read more about the inference response parameters in the [parameters
extension](https://github.com/triton-inference-server/server/blob/main/docs/protocol/extension_parameters.md)
documentation.

+Inference response parameters are currently not supported on BLS inference
+responses received by BLS models.
+
## Managing Python Runtime and Libraries

Python backend shipped in the [NVIDIA GPU Cloud](https://ngc.nvidia.com/)
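
To make the limitation added in this commit concrete, here is a minimal sketch of a BLS call inside a Python backend model. The model name `downstream_model` and the tensor names `INPUT0`/`OUTPUT0` are hypothetical, not part of this commit; the point is that any response parameters set by the downstream model are not exposed on the response object returned by `exec()`.

```python
# Minimal BLS sketch (hypothetical model/tensor names) illustrating the
# limitation documented above: response parameters set by the downstream
# model are not available on the InferenceResponse received via BLS.
import triton_python_backend_utils as pb_utils


class TritonPythonModel:
    def execute(self, requests):
        responses = []
        for request in requests:
            # Forward the request's input tensor to a downstream model via BLS.
            input0 = pb_utils.get_input_tensor_by_name(request, "INPUT0")
            bls_request = pb_utils.InferenceRequest(
                model_name="downstream_model",
                requested_output_names=["OUTPUT0"],
                inputs=[input0])
            bls_response = bls_request.exec()
            if bls_response.has_error():
                raise pb_utils.TritonModelException(
                    bls_response.error().message())

            # NOTE: any response parameters attached by "downstream_model" are
            # not carried on bls_response (the limitation noted in the README).
            output0 = pb_utils.get_output_tensor_by_name(
                bls_response, "OUTPUT0")
            responses.append(
                pb_utils.InferenceResponse(output_tensors=[output0]))
        return responses
```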