
Commit da5e81b

ODSC-29065: adding doc
1 parent 999be58 commit da5e81b

File tree: 2 files changed, +64 −0 lines changed

docs/source/user_guide/cli/opctl/local-development-setup.rst

Lines changed: 1 addition & 0 deletions

@@ -27,3 +27,4 @@ Setup up your workstation for development and testing your code locally before y
     localdev/jobs
     localdev/local_jobs
     localdev/local_pipelines
+    localdev/local_deployment
Lines changed: 63 additions & 0 deletions
@@ -0,0 +1,63 @@
++++++++++++++++++++++++++++++++
Local Model Deployment Execution
++++++++++++++++++++++++++++++++

You can test whether a deployment will work in a local container, to facilitate development and troubleshooting, by running a local predict.

-------------
Prerequisites
-------------

1. :doc:`Install ADS CLI<../../quickstart>`
2. Build a container image.

   - :doc:`Build Development Container Image<./jobs_container_image>`

------------
Restrictions
------------

When running locally, your local predict is subject to the following restrictions:

- The local predict must use API Key auth. Resource Principal auth is not supported in a local container. See https://docs.oracle.com/iaas/Content/API/Concepts/apisigningkey.htm
- You can only use conda environments published to your own Object Storage bucket. See :doc:`Working with Conda packs<./condapack>`
- Your model artifact files must be present on your local machine.
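Since the local predict requires API Key auth, it can save a failed container run to verify that an OCI config file exists first. A minimal sketch, assuming the default config location (``~/.oci/config``) and a ``[DEFAULT]`` profile (adjust ``OCI_CONFIG_FILE`` if your setup differs):

```shell
# Check for an API Key config before attempting a local predict.
# Assumes the default OCI config path and a [DEFAULT] profile.
CONFIG="${OCI_CONFIG_FILE:-$HOME/.oci/config}"
if [ -f "$CONFIG" ] && grep -q "^\[DEFAULT\]" "$CONFIG"; then
  echo "API Key config found at $CONFIG"
else
  echo "No API Key config at $CONFIG; see the apisigningkey docs above"
fi
```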
--------------------------
Running your Local Predict
--------------------------

.. note:: Currently, testing a deployment locally is only supported with conda environments.

Using a conda environment
=========================

The example below demonstrates how to run a local predict using an installed conda environment:

.. code-block:: shell

  ads opctl predict --artifact-directory /folder/your/model/artifacts/are/stored --payload '[[-1.68671955,2.25814541,-0.5068027,0.25248417,0.62665134,0.23441123]]' --conda-slug myconda_p38_cpu_v1

Parameter explanation:

- ``--ocid ocid1.datasciencemodel.oc1.iad.******``: Runs the predict locally in a docker container when you pass in a model id. If you pass in a deployment id, e.g. ``ocid1.datasciencemodeldeployment.oc1.iad.``, it will instead predict against the remote endpoint. In that case, only ``--ocid`` and ``--payload`` are needed.
- ``--conda-slug myconda_p38_cpu_v1``: Uses the ``myconda_p38_cpu_v1`` conda environment. Note that you must first publish this conda environment to your own bucket if you are using a service conda pack for your real deployment; however, you do not have to install it locally. If ``--conda-slug`` and ``--conda-path`` are not provided, we will attempt to auto-detect the conda information from the custom metadata, and then from the ``runtime.yaml`` in your model artifact. If the detected conda environment is a service pack, we will throw an error asking you to publish it first. You can also provide the conda information directly through ``--conda-slug`` or ``--conda-path``. Note that in order to test whether the deployment will work, you should provide the slug that you will use in the real deployment. The local conda environment directory is automatically mounted into the container and activated before the entrypoint is executed.
- ``--payload``: The payload to be passed to your model.
- ``--bucket-uri``: Used to download a large model artifact to your local machine. An extra policy needs to be placed on this bucket; see :doc:`link <./user_guide/model_registration/model_load.html#large-model-artifacts>` for more details. Small artifacts do not require this.
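Because the ``--payload`` value must be valid JSON, a quick way to catch quoting mistakes is to validate it locally before handing it to the CLI. A minimal sketch (the feature values are the sample row from above, not a real model's schema):

```shell
# Store the payload in a variable so the same value is validated and reused.
PAYLOAD='[[-1.68671955,2.25814541,-0.5068027,0.25248417,0.62665134,0.23441123]]'

# Fail fast if the payload is not well-formed JSON.
echo "$PAYLOAD" | python3 -m json.tool > /dev/null && echo "payload OK"

# Then pass it to the CLI unchanged, e.g.:
# ads opctl predict --artifact-directory /folder/your/model/artifacts/are/stored --payload "$PAYLOAD"
```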
.. code-block:: shell

  ads opctl predict --artifact-directory /folder/your/model/artifacts/are/stored --payload '[[-1.68671955,2.25814541,-0.5068027,0.25248417,0.62665134,0.23441123]]'

Parameter explanation:

- ``--artifact-directory /folder/your/model/artifacts/are/stored``: If you already have your model artifact stored in a local folder, you can use ``--artifact-directory`` instead of ``--ocid``.
----------------------------------------------------
Running your Predict Against the Deployment Endpoint
----------------------------------------------------

.. code-block:: shell

  ads opctl predict ocid1.datasciencemodeldeployment.oc1.iad.***** --payload '[[-1.68671955,2.25814541,-0.5068027,0.25248417,0.62665134,0.23441123]]'

Parameter explanation:

- ``--ocid ocid1.datasciencemodeldeployment.oc1.iad.******``: Runs the predict against the remote endpoint.
- ``--payload``: The payload to be passed to your model.
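Once a single-row predict against the endpoint works, it is easy to script it over several test rows. A sketch that only prints each command as a dry run (the OCID is a placeholder; drop the ``echo`` to actually send the requests):

```shell
DEPLOYMENT_OCID="ocid1.datasciencemodeldeployment.oc1.iad.*****"  # placeholder OCID

# One sample row per iteration; add more rows matching your model's input schema.
for ROW in \
  '[[-1.68671955,2.25814541,-0.5068027,0.25248417,0.62665134,0.23441123]]'
do
  CMD="ads opctl predict $DEPLOYMENT_OCID --payload $ROW"
  echo "$CMD"   # dry run: print the command instead of executing it
done
```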
