Update CLI commands #1427

Merged: 4 commits, Feb 27, 2025
28 changes: 13 additions & 15 deletions docs/Researcher/Walkthroughs/quickstart-inference.md
@@ -34,10 +34,10 @@ As described, the inference client can be created via CLI. To perform this, you

### Login

=== "CLI V1"
=== "CLI V2"
Run `runai login` and enter your credentials.

=== "CLI V2"
=== "CLI V1 (Deprecated)"
Run `runai login` and enter your credentials.

=== "User Interface"
@@ -65,11 +65,10 @@ Under `Environments` Select __NEW ENVIRONMENT__. Then select:

### Run an Inference Workload


=== "CLI V1"
=== "CLI V2"
Not available right now.

=== "CLI V2"
=== "CLI V1 (Deprecated)"
Not available right now.

=== "User Interface"
@@ -145,22 +144,21 @@ You can use the Run:ai Triton demo client to send requests to the server

* Copy the inference endpoint URL.

=== "CLI V1"
=== "CLI V2"
Open a terminal and run:

``` bash
runai config project team-a
runai submit inference-client-1 -i runai.jfrog.io/demo/example-triton-client \
-- perf_analyzer -m inception_graphdef -p 3600000 -u <INFERENCE-ENDPOINT>
runai project set team-a
runai training submit inference-client-1 -i runai.jfrog.io/demo/example-triton-client \
-- perf_analyzer -m inception_graphdef -p 3600000 -u <INFERENCE-ENDPOINT>
```


=== "CLI V2"
=== "CLI V1 (Deprecated)"
Open a terminal and run:

``` bash
runai project set team-a
runai training submit inference-client-1 -i runai.jfrog.io/demo/example-triton-client \
runai config project team-a
runai submit inference-client-1 -i runai.jfrog.io/demo/example-triton-client \
-- perf_analyzer -m inception_graphdef -p 3600000 -u <INFERENCE-ENDPOINT>
```
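In both variants, `<INFERENCE-ENDPOINT>` is a placeholder for the URL copied from the UI. A minimal shell sketch of the substitution (the endpoint value below is illustrative, not a real address):

```shell
# Hypothetical endpoint value -- replace with the URL copied from the UI.
INFERENCE_ENDPOINT="https://inference-server-1.example.com"

# The client command with the placeholder substituted; this is what is
# passed after `--` in the submit command shown above.
CLIENT_CMD="perf_analyzer -m inception_graphdef -p 3600000 -u ${INFERENCE_ENDPOINT}"
echo "${CLIENT_CMD}"
```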

@@ -185,10 +183,10 @@ In the user interface, under `inference-server-1`, go to the `Metrics` tab and w

Run the following:

=== "CLI V1"
=== "CLI V2"
Not available right now.

=== "CLI V2"
=== "CLI V1 (Deprecated)"
Not available right now.

=== "User Interface"
33 changes: 16 additions & 17 deletions docs/Researcher/Walkthroughs/quickstart-vscode.md
@@ -30,10 +30,10 @@ To complete this Quickstart __via the CLI__, you will need to have the Run:ai CL

### Login

=== "CLI V1"
=== "CLI V2"
Run `runai login` and enter your credentials.

=== "CLI V2"
=== "CLI V1 (Deprecated)"
Run `runai login` and enter your credentials.

=== "User Interface"
@@ -57,29 +57,28 @@ Under `Environments` Select __NEW ENVIRONMENT__. Then select:

### Run Workload


=== "CLI V1"
=== "CLI V2"
Open a terminal and run:

``` bash
runai config project team-a
runai submit vs1 --jupyter -g 1
runai project set team-a
runai workspace submit vs1 --image quay.io/opendatahub-contrib/workbench-images:vscode-datascience-c9s-py311_2023c_latest \
--gpu-devices-request 1 --external-url container=8787
```

!!! Note
For more information on the workload submit command, see [cli documentation](../cli-reference/runai-submit.md).
For more information on the workspace submit command, see [cli documentation](../cli-reference/new-cli/runai_workspace_submit.md).

=== "CLI V2"
=== "CLI V1 (Deprecated)"
Open a terminal and run:

``` bash
runai project set team-a
runai workspace submit vs1 --image quay.io/opendatahub-contrib/workbench-images:vscode-datascience-c9s-py311_2023c_latest \
--gpu-devices-request 1 --external-url container=8787
runai config project team-a
runai submit vs1 --jupyter -g 1
```

!!! Note
For more information on the workspace submit command, see [cli documentation](../cli-reference/new-cli/runai_workspace_submit.md).
For more information on the workload submit command, see [cli documentation](../cli-reference/runai-submit.md).

=== "User Interface"
* In the Run:ai UI select __Workloads__
@@ -141,16 +140,16 @@ Via the Run:ai user interface, go to `Workloads`, select the `vs1` Workspace and

Run the following:

=== "CLI V1"
``` bash
runai delete job vs1
```

=== "CLI V2"
```
runai workspace delete vs1
```

=== "CLI V1 (Deprecated)"
``` bash
runai delete job vs1
```

=== "User Interface"
Select the Workspace and press __DELETE__.

45 changes: 33 additions & 12 deletions docs/Researcher/Walkthroughs/walkthrough-build-ports.md
@@ -24,11 +24,23 @@

* At the command-line run:

``` bash
runai config project team-a
runai submit nginx-test -i zembutsu/docker-sample-nginx --interactive
runai port-forward nginx-test --port 8080:80
```
=== "CLI V2"
Open a terminal and run:

``` bash
runai project set team-a
runai training submit nginx-test -i zembutsu/docker-sample-nginx
runai port-forward nginx-test --port 8080:80
```

=== "CLI V1 (Deprecated)"
Open a terminal and run:

```shell
runai config project team-a
runai submit nginx-test -i zembutsu/docker-sample-nginx --interactive
runai port-forward nginx-test --port 8080:80
```

* The Job is based on a sample _NGINX_ webserver docker image `zembutsu/docker-sample-nginx`. Once accessed via a browser, the page shows the container name.
* Note the _interactive_ flag, which means the Job will not have a start or end. It is the Researcher's responsibility to close the Job.
@@ -37,13 +49,22 @@ runai port-forward nginx-test --port 8080:80

The result will be:

``` bash
The job 'nginx-test-0' has been submitted successfully
You can run `runai describe job nginx-test-0 -p team-a` to check the job status

Forwarding from 127.0.0.1:8080 -> 80
Forwarding from [::1]:8080 -> 80
```
=== "CLI V2"
```shell
Creating workspace nginx-test...
To track the workload's status, run 'runai workspace list'

port-forward started, opening ports [8080:80]
```

=== "CLI V1 (Deprecated)"
```shell
The job 'nginx-test-0' has been submitted successfully
You can run `runai describe job nginx-test-0 -p team-a` to check the job status

Forwarding from 127.0.0.1:8080 -> 80
Forwarding from [::1]:8080 -> 80
```

### Access the Webserver
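With the port-forward from the previous step still running, the page can be fetched from a second terminal (a sketch; assumes the forward is active on local port 8080):

```shell
# Fetch the sample NGINX page through the forwarded port; the response
# includes the name of the serving container.
curl -s http://127.0.0.1:8080
```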

68 changes: 33 additions & 35 deletions docs/Researcher/Walkthroughs/walkthrough-build.md
@@ -27,11 +27,10 @@ To complete this Quickstart __via the CLI__, you will need to have the Run:ai CL
## Step by Step Quickstart

### Login

=== "CLI V1"
=== "CLI V2"
Run `runai login` and enter your credentials.

=== "CLI V2"
=== "CLI V1 (Deprecated)"
Run `runai login` and enter your credentials.

=== "User Interface"
@@ -43,27 +42,26 @@ To complete this Quickstart __via the CLI__, you will need to have the Run:ai CL

### Create a Workspace


=== "CLI V1"
=== "CLI V2"
Open a terminal and run:

``` bash
runai config project team-a
runai submit build1 -i ubuntu -g 1 --interactive -- sleep infinity
runai project set team-a
runai workspace submit build1 -i ubuntu -g 1 --command -- sleep infinity
```

!!! Note
For more information on the workload submit command, see [cli documentation](../cli-reference/runai-submit.md).
For more information on the workspace submit command, see [cli documentation](../cli-reference/new-cli/runai_workspace_submit.md).

=== "CLI V2"
=== "CLI V1 (Deprecated)"
Open a terminal and run:

``` bash
runai project set team-a
runai workspace submit build1 -i ubuntu -g 1 --command -- sleep infinity
runai config project team-a
runai submit build1 -i ubuntu -g 1 --interactive -- sleep infinity
```

!!! Note
For more information on the workspace submit command, see [cli documentation](../cli-reference/new-cli/runai_workspace_submit.md).
For more information on the workload submit command, see [cli documentation](../cli-reference/runai-submit.md).

=== "User Interface"
* In the Run:ai UI select __Workloads__
@@ -115,14 +113,6 @@

Follow up on the Workload's progress by running:

=== "CLI V1"
``` bash
runai list jobs
```
The result:
![mceclip20.png](img/mceclip20.png)


=== "CLI V2"
``` bash
runai workspace list
@@ -136,6 +126,14 @@ Follow up on the Workload's progress by running:
vs1 Workspace Running team-a No 1/1 1.00
```
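The tabular output above lends itself to standard filtering; for example, a script can read a single workload's status (a sketch; assumes the column layout shown above, with the status in the third column):

```shell
# Sample output in the layout shown above (normally produced by
# `runai workspace list`); filter it for the status of workspace vs1.
printf 'NAME TYPE STATUS PROJECT\nvs1 Workspace Running team-a\n' |
  awk '$1 == "vs1" {print $3}'
# prints: Running
```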

=== "CLI V1 (Deprecated)"
``` bash
runai list jobs
```
The result:
![mceclip20.png](img/mceclip20.png)


=== "User Interface"
* Open the Run:ai user interface.
* Under "Workloads" you can view the new Workspace:
@@ -159,14 +157,14 @@ A full list of Job statuses can be found [here](../../platform-admin/workloads/o

To get additional status on your Workload run:

=== "CLI V1"
=== "CLI V2"
``` bash
runai describe job build1
runai workspace describe build1
```

=== "CLI V2"
=== "CLI V1 (Deprecated)"
``` bash
runai workspace describe build1
runai describe job build1
```

=== "User Interface"
@@ -175,15 +173,15 @@ To get additional status on your Workload run:

### Get a Shell to the container

=== "CLI V1"
Run:
=== "CLI V2"
``` bash
runai bash build1
runai workspace bash build1
```

=== "CLI V2"
=== "CLI V1 (Deprecated)"
Run:
``` bash
runai workspace bash build1
runai bash build1
```

This should provide a direct shell into the container.
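Once inside, ordinary commands run against the container's filesystem; a quick sanity check (a sketch; any standard Ubuntu tooling applies):

```shell
# Run inside the container shell: confirm where we are and what the
# Ubuntu image's filesystem contains.
hostname      # the pod/container hostname
ls /          # root of the image's filesystem
```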
@@ -195,16 +193,16 @@

Run the following:

=== "CLI V1"
``` bash
runai delete job build1
```

=== "CLI V2"
```
runai workspace delete build1
```

=== "CLI V1 (Deprecated)"
``` bash
runai delete job build1
```

=== "User Interface"
Select the Workspace and press __DELETE__.
