> The current release is version [2.51.0](https://github.com/triton-inference-server/server/releases/latest) and corresponds to the 24.10 container release on NVIDIA GPU Cloud (NGC).
Triton Inference Server is an open source inference serving software that
streamlines AI inferencing. Triton enables teams to deploy any AI model from
multiple deep learning and machine learning frameworks, including TensorRT,