
Commit 5b4f70d

ODSC-42082: Add usage of the ads opctl init to the ADS docs (#166)
2 parents 79044ea + 7ec06a8 commit 5b4f70d

3 files changed: +119 −9 lines

docs/Makefile

Lines changed: 32 additions & 0 deletions

@@ -0,0 +1,32 @@
+# Minimal makefile for Sphinx documentation
+#
+
+# You can set these variables from the command line, and also
+# from the environment for the first two.
+SPHINXOPTS    ?=
+SPHINXBUILD   ?= sphinx-build
+SOURCEDIR     = source
+BUILDDIR      = build
+ZIP_TARGET    = ads-latest.zip
+
+# Put it first so that "make" without argument is like "make help".
+help:
+	@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
+
+.PHONY: help Makefile
+
+# Catch-all target: route all unknown targets to Sphinx using the new
+# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
+%: Makefile
+	@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
+	@rm -rf dask-worker-space
+
+livehtml:
+	sphinx-autobuild "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
+
+zip: html
+	@cd build/html && zip -r ../../"$(ZIP_TARGET)" .
+	@echo "built: $(ZIP_TARGET)"
+
+clean:
+	rm -rf $(BUILDDIR)/*
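The new ``zip`` target shells out to an external ``zip`` binary; the same bundling step can be sketched with the Python standard library. The paths and the ``ads-latest`` name mirror the Makefile, while the sample ``index.html`` is a hypothetical stand-in for real Sphinx output:

```python
import shutil
import zipfile
from pathlib import Path

# Stand-in for the Sphinx output tree; a real run would populate
# build/html via `make html` instead.
html_dir = Path("build/html")
html_dir.mkdir(parents=True, exist_ok=True)
(html_dir / "index.html").write_text("<html></html>")

# Equivalent of: cd build/html && zip -r ../../ads-latest.zip .
# make_archive zips the *contents* of root_dir, matching the `cd` trick.
archive = shutil.make_archive("ads-latest", "zip", root_dir=str(html_dir))

with zipfile.ZipFile(archive) as zf:
    names = zf.namelist()
```

Archiving from inside ``build/html`` (rather than zipping the directory itself) keeps ``index.html`` at the archive root, which is what most static-site hosts expect.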

docs/source/conf.py

Lines changed: 5 additions & 1 deletion

@@ -32,7 +32,11 @@
     "sphinx.ext.inheritance_diagram",
     "nbsphinx",
     "sphinx_code_tabs",
-    "sphinx_copybutton"
+    "sphinx_copybutton",
+    "sphinx.ext.duration",
+    "sphinx.ext.autosummary",
+    "sphinx.ext.intersphinx",
+    "sphinx.ext.viewcode",
 ]

 # Add any paths that contain templates here, relative to this directory.
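With the four new entries applied, the ``extensions`` list in ``conf.py`` would read roughly as follows (a fragment only; the hunk starts at line 32, so earlier entries are omitted here):

```python
# Fragment of docs/source/conf.py after this commit (earlier entries omitted).
extensions = [
    "sphinx.ext.inheritance_diagram",
    "nbsphinx",
    "sphinx_code_tabs",
    "sphinx_copybutton",
    "sphinx.ext.duration",     # reports per-document read times in the build summary
    "sphinx.ext.autosummary",  # generates API summary tables
    "sphinx.ext.intersphinx",  # cross-links to other projects' documentation
    "sphinx.ext.viewcode",     # adds links to highlighted module source code
]
```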

docs/source/user_guide/cli/opctl/configure.rst

Lines changed: 82 additions & 8 deletions

@@ -1,9 +1,12 @@
+#################
 CLI Configuration
-=================
+#################

-**Prerequisite**
+.. _configuration_prerequisites:

-* You have completed :doc:`ADS CLI installation <../quickstart>`
+.. admonition:: Prerequisites
+
+   - You have completed :doc:`ADS CLI installation <../quickstart>`


 Set up default values for different options when running ``OCI Data Science Jobs`` or ``OCI DataFlow``. By setting defaults, you can avoid entering the compartment OCID, project OCID, etc.
@@ -40,14 +43,14 @@ This will prompt you to setup default ADS CLI configurations for each OCI profil
    subnet_id = oci.xxxx.<subnet-ocid>
    log_group_id = oci.xxxx.<log_group_ocid>
    log_id = oci.xxxx.<log_ocid>
-   shape_name = VM.Standard2.2
+   shape_name = VM.Standard.E2.4
    block_storage_size_in_GBs = 100

    [ANOTHERPROF]
    compartment_id = oci.xxxx.<compartment_ocid>
    project_id = oci.xxxx.<project_ocid>
    subnet_id = oci.xxxx.<subnet-ocid>
-   shape_name = VM.Standard2.1
+   shape_name = VM.Standard.E2.4
    log_group_id = ocid1.loggroup.oc1.xxx.xxxxx
    log_id = oci.xxxx.<log_ocid>
    block_storage_size_in_GBs = 50
@@ -57,10 +60,10 @@ This will prompt you to setup default ADS CLI configurations for each OCI profil

 .. code-block::

-   [MYTENANCYPROF]
+   [DEFAULT]
    compartment_id = oci.xxxx.<compartment_ocid>
-   driver_shape = VM.Standard2.1
-   executor_shape = VM.Standard2.1
+   driver_shape = VM.Standard.E2.4
+   executor_shape = VM.Standard.E2.4
    logs_bucket_uri = oci://mybucket@mytenancy/dataflow/logs
    script_bucket = oci://mybucket@mytenancy/dataflow/mycode/
    num_executors = 3
@@ -79,6 +82,34 @@ This will prompt you to setup default ADS CLI configurations for each OCI profil
    compartment_id = oci.xxxx.<compartment_ocid>
    project_id = oci.xxxx.<project_ocid>

+``~/.ads_ops/model_deployment_config.ini`` will contain defaults for deploying a ``Data Science Model``. Defaults are set for each profile listed in your oci config file. Here is a sample -
+
+.. code-block::
+
+   [DEFAULT]
+   compartment_id = ocid1.compartment.oc1..<unique_ID>
+   project_id = ocid1.datascienceproject.oc1.iad.<unique_ID>
+   shape_name = VM.Standard.E2.4
+   log_group_id = ocid1.loggroup.oc1.iad.<unique_ID>
+   log_id = ocid1.log.oc1.iad.<unique_ID>
+   bandwidth_mbps = 10
+   replica = 1
+   web_concurrency = 10
+
+   [ANOTHERPROF]
+   compartment_id = ocid1.compartment.oc1..<unique_ID>
+   project_id = ocid1.datascienceproject.oc1.iad.<unique_ID>
+   shape_name = VM.Standard.E2.4
+   log_group_id = ocid1.loggroup.oc1.iad.<unique_ID>
+   log_id = ocid1.log.oc1.iad.<unique_ID>
+   bandwidth_mbps = 20
+   replica = 2
+   web_concurrency = 20
+

 ``~/.ads_ops/local_backend.ini`` will contain defaults for running jobs and pipeline steps locally. While local operations do not involve connections to OCI services, default
 configurations are still set for each profile listed in your oci config file for consistency. Here is a sample -
@@ -93,3 +124,46 @@ configurations are still set for each profile listed in your oci config file for
    [ANOTHERPROF]
    max_parallel_containers = 4
    pipeline_status_poll_interval_seconds = 5
+
+
+Generate Starter YAML
+---------------------
+
+The examples in this section show how to generate a starter YAML specification for the Data Science Job, Data Flow Application, Data Science Model Deployment, and ML Pipeline services. The command takes into account the config files generated by the ``ads opctl configure`` operation, as well as values extracted from environment variables.
+
+To see the available options, run -
+
+.. code-block::
+
+   ads opctl init --help
+
+The resource type is a mandatory attribute. The currently supported resource types are `dataflow`, `deployment`, `job`, and `pipeline`.
+For instance, to generate a starter specification for a Data Science job, run -
+
+.. code-block::
+
+   ads opctl init job
+
+The resulting YAML is printed to the console. By default, the ``python`` runtime is used.
+
+
+**Supported runtimes**
+
+- For a ``job`` - `container`, `gitPython`, `notebook`, `python` and `script`.
+- For a ``pipeline`` - `container`, `gitPython`, `notebook`, `python` and `script`.
+- For a ``dataflow`` - `dataFlow` and `dataFlowNotebook`.
+- For a ``deployment`` - `conda` and `container`.
+
+
+To specify a particular runtime, use -
+
+.. code-block::
+
+   ads opctl init job --runtime-type container
+
+Use the ``--output`` attribute to save the result to a YAML file.
+
+.. code-block::
+
+   ads opctl init job --runtime-type container --output job_with_container_runtime.yaml
+
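The ``.ini`` samples in this page are ordinary INI files. How ``ads opctl`` loads them internally is not shown here, but reading one profile back can be sketched with Python's standard ``configparser``; the sample string below mirrors the ``[ANOTHERPROF]`` values from the ``ml_job_config.ini`` example above (all OCIDs are placeholders):

```python
import configparser

# Mirrors the [ANOTHERPROF] sample from ~/.ads_ops/ml_job_config.ini;
# the oci.xxxx.<...> values are placeholders from the doc, not real OCIDs.
sample = """
[ANOTHERPROF]
compartment_id = oci.xxxx.<compartment_ocid>
project_id = oci.xxxx.<project_ocid>
subnet_id = oci.xxxx.<subnet-ocid>
shape_name = VM.Standard.E2.4
block_storage_size_in_GBs = 50
"""

config = configparser.ConfigParser()
config.read_string(sample)

profile = config["ANOTHERPROF"]
shape = profile["shape_name"]                          # "VM.Standard.E2.4"
storage = profile.getint("block_storage_size_in_GBs")  # 50
```

Note that ``configparser`` keeps the last value when a key is repeated within a section, which is why each key should appear only once per profile.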
