
Commit d74f383

Merge pull request #825 from run-ai/v2.18
V2.18
2 parents d66467e + a7576e5 commit d74f383

108 files changed: +3902 -896 lines


.github/workflows/deploy-staging.yaml

Lines changed: 2 additions & 2 deletions
@@ -1,4 +1,4 @@
-name: publish docs CI to staging
+name: deploy docs CI to staging

 on:
   workflow_dispatch:
@@ -45,4 +45,4 @@ jobs:

       - name: Sync output to S3
         run: |
-          aws s3 sync ./site/ s3://${{ inputs.bucket_name }} --delete
+          aws s3 sync ./site/ s3://${{ inputs.bucket_name }} --delete
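For reference, the sync step comes down to a single AWS CLI call. A minimal local sketch, with a placeholder bucket name standing in for the workflow's `bucket_name` input:

```
# Mirror the built site to the staging bucket and delete objects that no
# longer exist locally ("my-docs-staging-bucket" is a placeholder).
aws s3 sync ./site/ s3://my-docs-staging-bucket --delete
```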

docs/Researcher/Walkthroughs/quickstart-inference.md

Lines changed: 13 additions & 16 deletions
@@ -2,44 +2,43 @@

 ## Introduction

-Machine learning (ML) inference is the process of running live data points into a machine-learning algorithm to calculate an output.
+Machine learning (ML) inference is the process of running live data points into a machine-learning algorithm to calculate an output.

-With Inference, you are taking a trained _Model_ and deploying it into a production environment. The deployment must align with the organization's production standards such as average and 95% response time as well as up-time.
+With Inference, you are taking a trained *Model* and deploying it into a production environment. The deployment must align with the organization's production standards such as average and 95% response time as well as up-time.

-## Prerequisites
+## Prerequisites

 To complete this Quickstart you must have:

-* Run:ai software installed on your Kubernetes cluster. See: [Installing Run:ai on a Kubernetes Cluster](../../admin/runai-setup/installation-types.md). There are additional prerequisites for running inference. See [cluster installation prerequisites](../../admin/runai-setup/cluster-setup/cluster-prerequisites.md#inference) for more information.
+* Run:ai software installed on your Kubernetes cluster. See: [Installing Run:ai on a Kubernetes Cluster](../../admin/runai-setup/installation-types.md). There are additional prerequisites for running inference. See [cluster installation prerequisites](../../admin/runai-setup/cluster-setup/cluster-prerequisites.md#inference) for more information.
 * Run:ai CLI installed on your machine. See: [Installing the Run:ai Command-Line Interface](../../admin/researcher-setup/cli-install.md)
-* You must have _ML Engineer_ access rights. See [Adding, Updating and Deleting Users](../../admin/admin-ui-setup/admin-ui-users.md) for more information.
+* You must have *ML Engineer* access rights. See [Adding, Updating and Deleting Users](../../admin/admin-ui-setup/admin-ui-users.md) for more information.

 ## Step by Step Walkthrough

 ### Setup

-* Login to the Projects area of the Run:ai user interface.
-* Add a Project named "team-a".
-* Allocate 2 GPUs to the Project.
+* Login to the Projects area of the Run:ai user interface.
+* Add a Project named "team-a".
+* Allocate 2 GPUs to the Project.

-### Run an Inference Workload
+### Run an Inference Workload

-* In the Run:ai user interface go to `Deployments`. If you do not see the `Deployments` section you may not have the required access control, or the inference module is disabled.
+* In the Run:ai user interface go to `Deployments`. If you do not see the `Deployments` section you may not have the required access control, or the inference module is disabled.
 * Select `New Deployment` on the top right.
 * Select `team-a` as a project and add an arbitrary name. Use the image `gcr.io/run-ai-demo/example-triton-server`.
 * Under `Resources` add 0.5 GPUs.
-* Under `Auto Scaling` select a minimum of 1, a maximum of 2. Use the `concurrency` autoscaling threshold method. Add a threshold of 3.
+* Under `Autoscaling` select a minimum of 1, a maximum of 2. Use the `concurrency` autoscaling threshold method. Add a threshold of 3.
 * Add a `Container port` of `8000`.

-
 This would start an inference workload for team-a with an allocation of a single GPU. Follow up on the Job's progress using the [Deployment list](../../admin/admin-ui-setup/deployments.md) in the user interface or by running `runai list jobs`

 ### Query the Inference Server

 The specific inference server we just created is accepting queries over port 8000. You can use the Run:ai Triton demo client to send requests to the server:

 * Find an IP address by running `kubectl get svc -n runai-team-a`. Use the `inference1-00001-private` Cluster IP.
-* Replace `<IP>` below and run:
+* Replace `<IP>` below and run:

 ```
 runai submit inference-client -i gcr.io/run-ai-demo/example-triton-client \
@@ -52,11 +51,10 @@ The specific inference server we just created is accepting queries over port 800
 runai logs inference-client
 ```

-
 ### View status on the Run:ai User Interface

 * Open the Run:ai user interface.
-* Under _Deployments_ you can view the new Workload. When clicking the workload, note the utilization graphs go up.
+* Under *Deployments* you can view the new Workload. When clicking the workload, note the utilization graphs go up.

 ### Stop Workload

@@ -66,4 +64,3 @@ Use the user interface to delete the workload.

 * You can also create Inference deployments via API. For more information see [Submitting Workloads via YAML](../../developer/cluster-api/submit-yaml.md).
 * See [Deployment](../../admin/admin-ui-setup/deployments.md) user interface.
-
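For quick reference, the command-line side of this walkthrough strings together as below. This is a minimal sketch only: the demo client's full argument list is truncated in the diff above, and the namespace assumes the Project is named `team-a`.

```
# Locate the inference service's Cluster IP (namespace assumes Project "team-a"):
kubectl get svc -n runai-team-a       # note the CLUSTER-IP of inference1-00001-private

# Follow the workloads from the CLI:
runai list jobs                       # shows the inference workload and the demo client
runai logs inference-client           # prints the demo client's output
```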
Lines changed: 42 additions & 0 deletions
@@ -0,0 +1,42 @@
+---
+title: Run:ai Command-line Interface
+summary: This article is the summary article for the CLI V2.
+authors:
+- Jason Novich
+date: 2024-Jun-18
+---
+
+## Summary
+
+The Run:ai Command-line Interface (CLI) is a tool for a Researcher to send deep learning workloads, acquire GPU-based containers, list jobs, and access other features in the Run:ai platform.
+
+```
+runai [flags]
+```
+
+### Options
+
+```
+      --config-file string   config file name; can be set by environment variable RUNAI_CLI_CONFIG_FILE (default "config.json")
+      --config-path string   config path; can be set by environment variable RUNAI_CLI_CONFIG_PATH (default "~/.runai/")
+  -d, --debug                enable debug mode
+  -h, --help                 help for runai
+  -v, --verbose              enable verbose mode
+```
+
+### See Also
+
+* [runai cluster](runai_cluster.md)&mdash;cluster management
+* [runai config](runai_config.md)&mdash;configuration management
+* [runai list](runai_list.md)&mdash;[Deprecated] display resource list. By default displays the job list
+* [runai login](runai_login.md)&mdash;login to the control plane
+* [runai logout](runai_logout.md)&mdash;logout from control plane
+* [runai node](runai_node.md)&mdash;node management
+* [runai nodepool](runai_nodepool.md)&mdash;node pool management
+* [runai project](runai_project.md)&mdash;project management
+* [runai report](runai_report.md)&mdash;report management
+* [runai training](runai_training.md)&mdash;training management
+* [runai upgrade](runai_upgrade.md)&mdash;upgrades the CLI to the latest version
+* [runai version](runai_version.md)&mdash;print version information
+* [runai workload](runai_workload.md)&mdash;workload management
+* [runai workspace](runai_workspace.md)&mdash;workspace management
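To make the global flags concrete, here is a minimal, hypothetical invocation; the file and path values shown are simply the documented defaults spelled out explicitly:

```
# Run a subcommand with the global flags set explicitly
# (values shown are the documented defaults; adjust to your environment).
runai --config-path ~/.runai/ --config-file config.json --verbose version
```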
Lines changed: 30 additions & 0 deletions
@@ -0,0 +1,30 @@
+## runai cluster
+
+cluster management
+
+```
+runai cluster [flags]
+```
+
+### Options
+
+```
+  -h, --help          help for cluster
+      --interactive   enable set interactive mode (enabled|disabled)
+```
+
+### Options inherited from parent commands
+
+```
+      --config-file string   config file name; can be set by environment variable RUNAI_CLI_CONFIG_FILE (default "config.json")
+      --config-path string   config path; can be set by environment variable RUNAI_CLI_CONFIG_PATH (default "~/.runai/")
+  -d, --debug                enable debug mode
+  -v, --verbose              enable verbose mode
+```
+
+### SEE ALSO
+
+* [runai](runai.md) - Run:ai Command-line Interface
+* [runai cluster list](runai_cluster_list.md) - cluster list command
+* [runai cluster set](runai_cluster_set.md) - set cluster context
+
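As a usage sketch, a typical flow is to list the available clusters and then set the context by name (the cluster name below is a placeholder):

```
# Show clusters visible to the logged-in user, then switch context
# ("my-cluster" is a placeholder cluster name).
runai cluster list
runai cluster set --name my-cluster
```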
Lines changed: 30 additions & 0 deletions
@@ -0,0 +1,30 @@
+## runai cluster list
+
+cluster list command
+
+```
+runai cluster list [flags]
+```
+
+### Options
+
+```
+  -h, --help    help for list
+      --json    Output structure JSON
+      --table   Output structure table
+      --yaml    Output structure YAML
+```
+
+### Options inherited from parent commands
+
+```
+      --config-file string   config file name; can be set by environment variable RUNAI_CLI_CONFIG_FILE (default "config.json")
+      --config-path string   config path; can be set by environment variable RUNAI_CLI_CONFIG_PATH (default "~/.runai/")
+  -d, --debug                enable debug mode
+  -v, --verbose              enable verbose mode
+```
+
+### SEE ALSO
+
+* [runai cluster](runai_cluster.md) - cluster management
+
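For example, to emit the cluster list as machine-readable output instead of the default table, a minimal sketch:

```
# Print the cluster list as JSON, e.g. for scripting.
runai cluster list --json
```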
Lines changed: 29 additions & 0 deletions
@@ -0,0 +1,29 @@
+## runai cluster set
+
+set cluster context
+
+```
+runai cluster set [flags]
+```
+
+### Options
+
+```
+  -h, --help          help for set
+      --id string     set by cluster ID
+      --name string   set by cluster name
+```
+
+### Options inherited from parent commands
+
+```
+      --config-file string   config file name; can be set by environment variable RUNAI_CLI_CONFIG_FILE (default "config.json")
+      --config-path string   config path; can be set by environment variable RUNAI_CLI_CONFIG_PATH (default "~/.runai/")
+  -d, --debug                enable debug mode
+  -v, --verbose              enable verbose mode
+```
+
+### SEE ALSO
+
+* [runai cluster](runai_cluster.md) - cluster management
+
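The context can be selected either by name or by ID; a minimal sketch with a placeholder value:

```
# Switch the CLI's cluster context by ID instead of by name
# (the UUID below is a placeholder).
runai cluster set --id 00000000-0000-0000-0000-000000000000
```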
Lines changed: 30 additions & 0 deletions
@@ -0,0 +1,30 @@
+## runai config
+
+configuration management
+
+```
+runai config [flags]
+```
+
+### Options
+
+```
+  -h, --help          help for config
+      --interactive   enable set interactive mode (enabled|disabled)
+```
+
+### Options inherited from parent commands
+
+```
+      --config-file string   config file name; can be set by environment variable RUNAI_CLI_CONFIG_FILE (default "config.json")
+      --config-path string   config path; can be set by environment variable RUNAI_CLI_CONFIG_PATH (default "~/.runai/")
+  -d, --debug                enable debug mode
+  -v, --verbose              enable verbose mode
+```
+
+### SEE ALSO
+
+* [runai](runai.md) - Run:ai Command-line Interface
+* [runai config generate](runai_config_generate.md) - generate config file
+* [runai config set](runai_config_set.md) - Set configuration values
+
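A usage sketch chaining the two subcommands documented below; the file name is a placeholder and the output-type value is an assumption (see the subcommand pages for the full flag lists):

```
# Typical flow: generate a config file, then adjust individual values.
# "my-config.json" is a placeholder; "table" as an output type is an assumption.
runai config generate --file my-config.json
runai config set --output table
```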
Lines changed: 30 additions & 0 deletions
@@ -0,0 +1,30 @@
+## runai config generate
+
+generate config file
+
+```
+runai config generate [flags]
+```
+
+### Options
+
+```
+      --file string   Output structure to file
+  -h, --help          help for generate
+      --json          Output structure JSON
+      --yaml          Output structure YAML
+```
+
+### Options inherited from parent commands
+
+```
+      --config-file string   config file name; can be set by environment variable RUNAI_CLI_CONFIG_FILE (default "config.json")
+      --config-path string   config path; can be set by environment variable RUNAI_CLI_CONFIG_PATH (default "~/.runai/")
+  -d, --debug                enable debug mode
+  -v, --verbose              enable verbose mode
+```
+
+### SEE ALSO
+
+* [runai config](runai_config.md) - configuration management
+
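For example, writing the generated configuration to a specific file, or emitting it as YAML; a minimal sketch (the file name is a placeholder, and printing to stdout when `--file` is omitted is an assumption):

```
# Write the generated config structure to a file ("my-config.json" is a placeholder).
runai config generate --file my-config.json

# Emit the structure as YAML (stdout behavior without --file is an assumption).
runai config generate --yaml
```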
Lines changed: 31 additions & 0 deletions
@@ -0,0 +1,31 @@
+## runai config set
+
+Set configuration values
+
+```
+runai config set [flags]
+```
+
+### Options
+
+```
+      --auth-url string   set the authorization URL
+      --cp-url string     set the control plane URL
+  -h, --help              help for set
+      --interactive       enable set interactive mode (enabled|disabled)
+      --output string     set the default output type
+```
+
+### Options inherited from parent commands
+
+```
+      --config-file string   config file name; can be set by environment variable RUNAI_CLI_CONFIG_FILE (default "config.json")
+      --config-path string   config path; can be set by environment variable RUNAI_CLI_CONFIG_PATH (default "~/.runai/")
+  -d, --debug                enable debug mode
+  -v, --verbose              enable verbose mode
+```
+
+### SEE ALSO
+
+* [runai config](runai_config.md) - configuration management
+
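For example, setting the control plane and authorization URLs in one call; a minimal sketch with placeholder URLs:

```
# Point the CLI at a control plane and an authorization endpoint
# (both URLs are placeholders for your environment).
runai config set --cp-url https://runai.example.com --auth-url https://auth.example.com
```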
Lines changed: 33 additions & 0 deletions
@@ -0,0 +1,33 @@
+## runai list
+
+[Deprecated] display resource list. By default displays the job list
+
+```
+runai list [flags]
+```
+
+### Options
+
+```
+  -A, --all-projects     list jobs from all projects
+  -h, --help             help for list
+  -p, --project string   Specify the project to which the command applies. By default, commands apply to the default project. To change the default project use ‘runai config project <project name>’
+```
+
+### Options inherited from parent commands
+
+```
+      --config-file string   config file name; can be set by environment variable RUNAI_CLI_CONFIG_FILE (default "config.json")
+      --config-path string   config path; can be set by environment variable RUNAI_CLI_CONFIG_PATH (default "~/.runai/")
+  -d, --debug                enable debug mode
+  -v, --verbose              enable verbose mode
+```
+
+### SEE ALSO
+
+* [runai](runai.md) - Run:ai Command-line Interface
+* [runai list clusters](runai_list_clusters.md) - [Deprecated] list all available clusters
+* [runai list jobs](runai_list_jobs.md) - [Deprecated] list all jobs
+* [runai list nodes](runai_list_nodes.md) - [Deprecated] list all nodes
+* [runai list projects](runai_list_projects.md) - [Deprecated] list all available projects
+
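For example, listing jobs across all projects or scoping the list to a single project; a minimal sketch (the project name is a placeholder):

```
# List jobs from every project the user can see.
runai list -A

# List jobs for one project only ("team-a" is a placeholder project name).
runai list -p team-a
```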
