- ``--ocid``: Runs the prediction locally in a Docker container when you pass a model OCID. If you pass a deployment OCID, e.g. ``ocid1.datasciencemodeldeployment.oc1.iad.``, the prediction runs against the remote endpoint instead; in that case, only ``--ocid`` and ``--payload`` are needed.
- ``--conda-slug myconda_p38_cpu_v1``: Use the ``myconda_p38_cpu_v1`` conda environment. The environment should already be installed locally. If it is not, you can provide its location with ``--conda-path``, for example ``--conda-path "oci://my-bucket@mytenancy/.../myconda_p38_cpu_v1"``, and the conda environment will be downloaded and installed locally. Note that if your real deployment uses a service conda pack, you must first publish it to your own bucket. If ``--conda-slug`` and ``--conda-path`` are not provided, the conda information is auto-detected from the custom metadata first, then from the ``runtime.yaml`` in your model artifact. If the detected conda environment is a service pack, an error is raised asking you to publish it first. To test whether the deployment will work, provide the same slug that you will use in the real deployment. The local conda environment directory is automatically mounted into the container and activated before the entrypoint is executed.
- ``--payload``: The payload to be passed to your model.
- ``--bucket-uri``: Used to download a large model artifact to your local machine. An extra policy needs to be in place for this bucket; see :doc:`link <./user_guide/model_registration/model_load.html#large-model-artifacts>` for details. Small artifacts do not require this.
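As a sketch of how these options combine (assuming the ``ads opctl predict`` subcommand; the OCIDs, bucket names, and payload below are placeholders, not real values):

.. code-block:: shell

    # Predict locally in a Docker container against a model artifact,
    # using a published conda environment and a staging bucket for a
    # large artifact (all identifiers are placeholders):
    ads opctl predict \
        --ocid ocid1.datasciencemodel.oc1.iad.xxxxx \
        --conda-slug myconda_p38_cpu_v1 \
        --payload '[[1, 2, 3]]' \
        --bucket-uri oci://my-bucket@my-namespace/artifact-staging/

    # Predict against a remote deployment endpoint; only --ocid and
    # --payload are needed:
    ads opctl predict \
        --ocid ocid1.datasciencemodeldeployment.oc1.iad.xxxxx \
        --payload '[[1, 2, 3]]'

The payload format must match what your model's ``score.py`` expects; a JSON string is shown here only as an illustration.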