Update integration-overview.md #1474

Merged (1 commit) on May 21, 2025
docs/platform-admin/integrations/integration-overview.md (30 changes: 13 additions & 17 deletions)

# Integrations with Run:ai

The table below summarizes the integration capabilities with various third-party products.

## Integration support

Support for third-party integrations varies. When noted below, the integration is supported out of the box with Run:ai. For other integrations, our Customer Success team has prior experience assisting customers with setup. In many cases, the NVIDIA Enterprise Support Portal may include additional reference documentation provided on an as-is basis.

## Integrations

| Tool | Category | Run:ai support details | Additional Information|
| ------------------ | ----------------| --------------------------------------------------------------------------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| Triton | Orchestration | Supported | Usage via docker base image. Quickstart inference [example](../../Researcher/Walkthroughs/quickstart-inference.md) |
| Spark | Orchestration | Community Support | <div style="width: 300px;"> It is possible to schedule Spark workflows with the Run:ai Scheduler. Sample code: [How to Run Spark jobs with Run:ai](https://enterprise-support.nvidia.com/s/article/How-to-Run-Spark-jobs-with-Run-AI){target=_blank}.</div> |
| Kubeflow Pipelines | Orchestration | Community Support | It is possible to schedule Kubeflow Pipelines with the Run:ai Scheduler. Sample code: [How to integrate Run:ai with Kubeflow](https://enterprise-support.nvidia.com/s/article/How-to-integrate-Run-ai-with-Kubeflow){target=_blank} |
| Apache Airflow | Orchestration | Community Support | It is possible to schedule Airflow workflows with the Run:ai Scheduler. Sample code: [How to integrate Run:ai with Apache Airflow](https://enterprise-support.nvidia.com/s/article/How-to-integrate-Run-ai-with-Apache-Airflow){target=_blank} |
| Argo Workflows | Orchestration | Community Support | It is possible to schedule Argo workflows with the Run:ai Scheduler. Sample code: [How to integrate Run:ai with Argo Workflows](https://enterprise-support.nvidia.com/s/article/How-to-integrate-Run-ai-with-Argo-Workflows){target=_blank} |
| Seldon Core | Orchestration | Community Support | It is possible to schedule Seldon Core workloads with the Run:ai Scheduler. Sample code: [How to integrate Run:ai with Seldon Core](https://enterprise-support.nvidia.com/s/article/How-to-integrate-Run-ai-with-Seldon-Core){target=_blank} |
| Jupyter Notebook | Development | Supported | Run:ai provides integrated support with Jupyter Notebooks. Quickstart [example](../../Researcher/workloads/workspaces/quickstart-jupyter.md) |
| Jupyter Hub | Development | Community Support | It is possible to submit Run:ai workloads via JupyterHub. Sample code: [How to connect JupyterHub with Run:ai](https://enterprise-support.nvidia.com/s/article/How-to-connect-JupyterHub-with-Run-ai){target=_blank} |
| PyCharm | Development | Supported | Containers created by Run:ai can be accessed via PyCharm. PyCharm [example](../../Researcher/tools/dev-pycharm.md) |
| VS Code | Development | Supported | - Containers created by Run:ai can be accessed via Visual Studio Code. [example](../../Researcher/tools/dev-vscode.md) <br>- You can automatically launch Visual Studio Code Web from the Run:ai console. [example](../../Researcher/Walkthroughs/quickstart-vscode.md) |
| Kubeflow Notebooks | Development | Community Support | It is possible to schedule Kubeflow Notebooks with the Run:ai Scheduler. Sample code: [How to integrate Run:ai with Kubeflow](https://enterprise-support.nvidia.com/s/article/How-to-integrate-Run-ai-with-Kubeflow){target=_blank} |
| Ray | Training, inference, data processing | Community Support | It is possible to schedule Ray jobs with the Run:ai Scheduler. Sample code: [How to Integrate Run:ai with Ray](https://enterprise-support.nvidia.com/s/article/How-to-Integrate-Run-ai-with-Ray){target=_blank} |
| TensorBoard | Experiment tracking | Supported | Run:ai comes with a preset Tensorboard [Environment](../workloads/assets/environments.md) asset. TensorBoard [example](../../Researcher/tools/dev-tensorboard.md). <br> Additional [sample](https://github.com/run-ai/use-cases/tree/master/runai_tensorboard_demo_with_resnet){target=_blank} |
| Weights & Biases | Experiment tracking | Community Support | It is possible to schedule W&B workloads with the Run:ai Scheduler. Sample code: [How to integrate with Weights and Biases](https://enterprise-support.nvidia.com/s/article/How-to-integrate-with-Weights-and-Biases){target=_blank} <br> Additional samples [here](https://github.com/run-ai/use-cases/tree/master/runai_wandb){target=_blank} |
| ClearML | Experiment tracking | Community Support | It is possible to schedule ClearML workloads with the Run:ai Scheduler. Sample code: [How to integrate Run:ai with ClearML](https://enterprise-support.nvidia.com/s/article/How-to-integrate-Run-ai-with-ClearML){target=_blank} |
| MLflow | Model Serving | Community Support | It is possible to use MLflow together with the Run:ai Scheduler. Sample code: [How to integrate Run:ai with MLflow](https://enterprise-support.nvidia.com/s/article/How-to-integrate-Run-ai-with-MLflow){target=_blank} <br> Additional MLflow [sample](https://github.com/run-ai/use-cases/tree/master/runai_mlflow_demo){target=_blank} |
| Hugging Face | Repositories | Supported | Run:ai provides an out-of-the-box integration with Hugging Face. |
| Docker Registry | Repositories | Supported | Run:ai allows using a Docker registry as a [Credentials](../workloads/assets/credentials.md) asset. |
| S3 | Storage | Supported | Run:ai communicates with S3 by defining a [data source](../workloads/assets/datasources.md) asset. |
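
The community-supported orchestration tools above (Spark, Airflow, Argo Workflows, Ray, and others) generally hand their pods to the Run:ai Scheduler by setting the pod's `schedulerName` field, exposed through each tool's own pod-template configuration. A minimal sketch of that pattern is below; the scheduler name `runai-scheduler` and the namespace `runai-team-a` are illustrative assumptions, so verify the actual values for your cluster against the articles linked in the table.

```yaml
# Minimal pod spec routing scheduling to the Run:ai Scheduler.
# Assumptions (verify against your cluster):
#   - the scheduler is registered as "runai-scheduler"
#   - the namespace "runai-team-a" maps to a Run:ai project
apiVersion: v1
kind: Pod
metadata:
  name: demo-gpu-pod
  namespace: runai-team-a
spec:
  schedulerName: runai-scheduler   # use Run:ai instead of the default Kubernetes scheduler
  containers:
    - name: main
      image: nvcr.io/nvidia/pytorch:24.05-py3
      resources:
        limits:
          nvidia.com/gpu: 1        # GPU request counted against the Run:ai project quota
```

Most of the third-party integrations listed above reduce to surfacing this `schedulerName` (plus GPU resource requests) through the tool's pod-template or worker configuration.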