Commit 2b21579

ODSC-36358. [ads_docs] Add raw-request examples for Deployment section (#31)

oci cli raw-request examples to model deployment prediction

1 parent b6ab4a0 commit 2b21579
File tree: 9 files changed, +683 -17 lines changed

docs/source/user_guide/cli/opctl/localdev/condapack.rst

Lines changed: 10 additions & 3 deletions

@@ -24,6 +24,14 @@ create

Build conda packs from your workstation using the ``ads opctl conda create`` subcommand.

.. admonition:: Tip

    To publish a conda pack that is natively installed on an Oracle Linux host (compute or laptop), set the ``NO_CONTAINER`` environment variable to remove the dependency on the ml-job container image:

    .. code-block:: shell

        NO_CONTAINER=1 ads opctl conda publish -s <slug> --auth <api_key/instance_principal/resource_principal>

-------
publish
-------

@@ -32,8 +40,7 @@ publish

.. code-block:: shell

    ads opctl conda publish -s <slug>

Publish a conda pack to the Object Storage bucket from your laptop or workstation. You can use this conda pack inside the ``OCI Data Science Service`` or the ``Data Flow service``.

-------
install
-------

@@ -43,4 +50,4 @@ Install conda pack using its URI. The conda pack can be used inside the docker i

.. code-block:: shell

    ads opctl conda install -u "oci://mybucket@namespace/conda_environment/path/to/my/conda"

docs/source/user_guide/model_registration/frameworks/lightgbmmodel.rst

Lines changed: 122 additions & 0 deletions

@@ -121,6 +121,128 @@ Run Prediction against Endpoint

    [1,0,...,1]
Run Prediction with oci raw-request command
===========================================

Model deployment endpoints can be invoked with the OCI CLI. The examples below invoke a model deployment with the CLI using different payload types: ``json``, ``numpy.ndarray``, ``pandas.core.frame.DataFrame``, and ``dict``.
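Whatever the type, the CLI sends the same JSON envelope; only the ``data`` field and the optional ``data_type`` marker differ. A minimal sketch of the four body shapes (all values are illustrative placeholders, not real model inputs):

```python
import json

# One request-body envelope per supported payload type; the values are
# abbreviated placeholders standing in for real model inputs.
bodies = {
    "json": {"data": [[0.59990614, 1.50228029]]},
    "numpy.ndarray": {"data": "k05VTVBZ...", "data_type": "numpy.ndarray"},
    "pandas.core.frame.DataFrame": {
        "data": "{\"0\":{\"0\":0.4133005141}}",
        "data_type": "pandas.core.frame.DataFrame",
    },
    "dict": {"data": {"0": {"0": 0.5999061426438217}}},
}

# Every body must serialize to valid JSON before it is passed to
# `oci raw-request --request-body`.
for name, body in bodies.items():
    print(name, json.dumps(body))
```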
`json` payload example
----------------------

.. code-block:: python3

    >>> # Prepare data sample for prediction
    >>> data = testx[[11]]
    >>> data
    array([[ 0.59990614,  0.95516275, -1.22366985, -0.89270887, -1.14868768,
            -0.3506047 ,  0.28529227,  2.00085413,  0.31212668,  0.39411511,
             0.87301082, -0.01743923, -1.15089633,  1.03461823,  1.50228029]])

Use the printed output of the data and the endpoint to invoke prediction with the raw-request command in a terminal:

.. code-block:: bash

    export uri=https://modeldeployment.{region}.oci.customer-oci.com/ocid1.datasciencemodeldeployment.oc1.xxx.xxxxx/predict
    export data='{"data": [[ 0.59990614, 0.95516275, ... , 1.50228029]]}'
    oci raw-request \
        --http-method POST \
        --target-uri $uri \
        --request-body "$data"
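To avoid hand-copying array values into the shell string (and the JSON errors that come with it), the body can be generated by a short Python one-liner; the numbers below are placeholders pasted from the printed array, since ``testx`` is not available in the shell:

```shell
# Build the request body as guaranteed-valid JSON instead of hand-editing it.
# The numbers are placeholders copied from the printed array above.
data=$(python3 -c 'import json; print(json.dumps({"data": [[0.59990614, 0.95516275, 1.50228029]]}))')
echo "$data"
```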
`numpy.ndarray` payload example
-------------------------------

.. code-block:: python3

    >>> # Prepare data sample for prediction
    >>> from io import BytesIO
    >>> import base64
    >>> import numpy as np

    >>> data = testx[[10]]
    >>> np_bytes = BytesIO()
    >>> np.save(np_bytes, data, allow_pickle=True)
    >>> data = base64.b64encode(np_bytes.getvalue()).decode("utf-8")
    >>> print(data)
    k05VTVBZAQB2AHsnZGVzY......D1cJ+D8=

Use the printed output of the `base64` data and the endpoint to invoke prediction with the raw-request command in a terminal:

.. code-block:: bash

    export uri=https://modeldeployment.{region}.oci.customer-oci.com/ocid1.datasciencemodeldeployment.oc1.xxx.xxxxx/predict
    export data='{"data":"k05VTVBZAQB2AHsnZGVzY......4UdN0L8=", "data_type": "numpy.ndarray"}'
    oci raw-request \
        --http-method POST \
        --target-uri $uri \
        --request-body "$data"
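On the receiving side, ``score.py`` can recover the array by reversing the encoding before deserializing. A byte-level sketch of the round trip (the ``np.load`` call is left as a comment because the actual ``score.py`` contents are an assumption):

```python
import base64

# Stand-in for the bytes that np.save() wrote into the BytesIO buffer
saved_bytes = b"\x93NUMPY\x01\x00placeholder"
encoded = base64.b64encode(saved_bytes).decode("utf-8")

# Inside score.py, reverse the base64 step to recover the original bytes
decoded = base64.b64decode(encoded.encode("utf-8"))
assert decoded == saved_bytes
# data = np.load(BytesIO(decoded), allow_pickle=True)  # would rebuild the ndarray
```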
`pandas.core.frame.DataFrame` payload example
---------------------------------------------

.. code-block:: python3

    >>> # Prepare data sample for prediction
    >>> import json
    >>> import pandas as pd

    >>> df = pd.DataFrame(testx[[10]])
    >>> print(json.dumps(df.to_json()))
    "{\"0\":{\"0\":0.4133005141},\"1\":{\"0\":0.676589266},...,\"14\":{\"0\":-0.2547168443}}"

Use the printed output of the ``DataFrame`` data and the endpoint to invoke prediction with the raw-request command in a terminal:

.. code-block:: bash

    export uri=https://modeldeployment.{region}.oci.customer-oci.com/ocid1.datasciencemodeldeployment.oc1.xxx.xxxxx/predict
    export data='{"data":"{\"0\":{\"0\":0.4133005141},...,\"14\":{\"0\":-0.2547168443}}","data_type":"pandas.core.frame.DataFrame"}'
    oci raw-request \
        --http-method POST \
        --target-uri $uri \
        --request-body "$data"
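The backslash-escaped quotes in the printed value come from double encoding: ``df.to_json()`` already returns a JSON string, and ``json.dumps()`` escapes it a second time. A small sketch of why two ``json.loads()`` calls are needed to undo it (the sample string is abbreviated from the output above):

```python
import json

# df.to_json() returns a JSON *string*; json.dumps() then escapes it again.
inner = '{"0":{"0":0.4133005141},"14":{"0":-0.2547168443}}'
wrapped = json.dumps(inner)

# Two json.loads() calls peel off both layers of encoding:
restored = json.loads(json.loads(wrapped))
assert restored["0"]["0"] == 0.4133005141
```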
`dict` payload example
----------------------

.. code-block:: python3

    >>> # Prepare data sample for prediction
    >>> import json
    >>> import pandas as pd

    >>> df = pd.DataFrame(testx[[11]])
    >>> print(json.dumps(df.to_dict()))
    {"0": {"0": 0.5999061426438217}, "1": {"0": 0.9551627492226553}, ...,"14": {"0": 1.5022802918908846}}

Use the printed output of the ``dict`` data and the endpoint to invoke prediction with the raw-request command in a terminal:

.. code-block:: bash

    export uri=https://modeldeployment.{region}.oci.customer-oci.com/ocid1.datasciencemodeldeployment.oc1.xxx.xxxxx/predict
    export data='{"data": {"0": {"0": 0.5999061426438217}, ...,"14": {"0": 1.5022802918908846}}}'
    oci raw-request \
        --http-method POST \
        --target-uri $uri \
        --request-body "$data"
Expected output of raw-request command
--------------------------------------

.. code-block:: bash

    {
      "data": {
        "prediction": [
          1
        ]
      },
      "headers": {
        "Connection": "keep-alive",
        "Content-Length": "18",
        "Content-Type": "application/json",
        "Date": "Wed, 07 Dec 2022 18:31:05 GMT",
        "X-Content-Type-Options": "nosniff",
        "opc-request-id": "B67EED723ADD43D7BBA1AD5AFCCBD0C6/03F218FF833BC8A2D6A5BDE4AB8B7C12/6ACBC4E5C5127AC80DA590568D628B60",
        "server": "uvicorn"
      },
      "status": "200 OK"
    }
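The prediction can be pulled out of the response programmatically rather than by eye; a sketch using only ``python3`` on a saved copy of the response (the file name ``response.json`` and its abbreviated contents are assumptions):

```shell
# Save the raw-request response to a file first, then extract the prediction.
# This sample response mirrors the expected output above.
cat > response.json <<'EOF'
{"data": {"prediction": [1]}, "status": "200 OK"}
EOF
python3 -c 'import json; print(json.load(open("response.json"))["data"]["prediction"][0])'
# prints: 1
```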
Example
=======

docs/source/user_guide/model_registration/frameworks/pytorchmodel.rst

Lines changed: 137 additions & 3 deletions

@@ -267,17 +267,151 @@ Predict with Image

Predict an image by passing a URI, which can be http(s), a local path, or another URL
(e.g. starting with "oci://", "s3://", or "gcs://"), of the image or a ``PIL.Image.Image`` object
using the ``image`` argument in ``predict()`` to predict a single image.
The image will be converted to a tensor and then serialized so it can be passed to the endpoint.
You can catch the tensor in ``score.py`` to perform further transformation.

.. note::

    The payload size limit is 10 MB. Read more about invoking a model deployment `here <https://docs.oracle.com/iaas/data-science/using/model-dep-invoke.htm#model_dep_invoke>`_.
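Because of the 10 MB limit, it is worth checking the serialized body size before invoking the endpoint. A sketch with a stand-in payload (the real body would carry the base64-encoded tensor):

```python
import base64
import json

# Stand-in payload: 1 KiB of zero bytes in place of the serialized tensor
payload = {
    "data": base64.b64encode(b"\x00" * 1024).decode("utf-8"),
    "data_type": "torch.Tensor",
}

body = json.dumps(payload).encode("utf-8")
# Model deployment rejects request bodies over 10 MB, so verify before sending
limit = 10 * 1024 * 1024
assert len(body) < limit, f"payload of {len(body)} bytes exceeds {limit}"
```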
Given the size limit, the example below uses a resized image. To pass an image and invoke prediction, additional code inside ``score.py`` is required to preprocess the data. Open ``pytorch_model_artifact/score.py`` and update the ``pre_inference()`` method. The edits are highlighted:
.. code-block:: python3
    :emphasize-lines: 15-26

    def pre_inference(data, input_schema_path):
        """
        Preprocess json-serialized data to feed into predict function.

        Parameters
        ----------
        data: Data format as expected by the predict API of the core estimator.
        input_schema_path: path of input schema.

        Returns
        -------
        data: Data format after any processing.
        """
        data = deserialize(data, input_schema_path)
        import torchvision.transforms as transforms
        preprocess = transforms.Compose([
            transforms.Resize(256),
            transforms.CenterCrop(224),
            transforms.Normalize(
                mean=[0.485, 0.456, 0.406],
                std=[0.229, 0.224, 0.225]
            ),
        ])
        input_tensor = preprocess(data)
        input_batch = input_tensor.unsqueeze(0)
        return input_batch
Save ``score.py`` and verify prediction works:

.. code-block:: python3

    >>> uri = ("https://github.com/oracle-samples/oci-data-science-ai-samples/tree/master/model_deploy_examples/images/dog_resized.jpg")
    >>> prediction = pytorch_model.verify(image=uri)["prediction"]
    >>> import numpy as np
    >>> np.argmax(prediction)
    258
Re-deploy model with updated ``score.py``:

.. code-block:: python3

    pytorch_model.deploy(
        display_name="PyTorch Model For Classification",
        deployment_log_group_id="ocid1.loggroup.oc1.xxx.xxxxxx",
        deployment_access_log_id="ocid1.log.oc1.xxx.xxxxxx",
        deployment_predict_log_id="ocid1.log.oc1.xxx.xxxxxx",
    )

Run prediction with the image provided:

.. code-block:: python3

    uri = ("https://github.com/oracle-samples/oci-data-science-ai-samples/tree/master/model_deploy_examples/images/dog_resized.jpg")

    # Generate prediction by invoking the deployed endpoint
    prediction = pytorch_model.predict(image=uri)['prediction']
Run Prediction with oci raw-request command
===========================================

Model deployment endpoints can be invoked with the OCI CLI. This example invokes a model deployment with the CLI using a ``torch.Tensor`` payload:
`torch.Tensor` payload example
------------------------------

.. code-block:: python3

    >>> # Prepare data sample for prediction and save it to file 'data-payload'
    >>> from io import BytesIO
    >>> import base64
    >>> import torch

    >>> buffer = BytesIO()
    >>> torch.save(input_batch, buffer)
    >>> data = base64.b64encode(buffer.getvalue()).decode("utf-8")
    >>> with open('data-payload', 'w') as f:
    ...     f.write('{"data": "' + data + '", "data_type": "torch.Tensor"}')

File ``data-payload`` will have this information:

.. code-block:: bash

    {"data": "UEsDBAAACAgAAAAAAAAAAAAAAAAAAAAAAAAQ ........................
    .......................................................................
    ...AAAAEAAABQSwUGAAAAAAMAAwC3AAAA0jEJAAAA", "data_type": "torch.Tensor"}
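Before invoking the endpoint, the generated file can be sanity-checked as valid JSON carrying the fields the endpoint expects; a minimal sketch (the truncated ``data`` value is a placeholder):

```python
import json

# Write a sample 'data-payload' file (truncated base64 placeholder) and
# verify it parses as JSON with the expected fields.
with open("data-payload", "w") as f:
    f.write('{"data": "UEsDBAAACAgAAA...", "data_type": "torch.Tensor"}')

with open("data-payload") as f:
    body = json.load(f)

assert body["data_type"] == "torch.Tensor"
assert isinstance(body["data"], str)
```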
367+
Use file ``data-payload`` with data and endpoint to invoke prediction with raw-request command in terminal:
368+
369+
.. code-block:: bash
370+
371+
export uri=https://modeldeployment.{region}.oci.customer-oci.com/ocid1.datasciencemodeldeployment.oc1.xxx.xxxxx/predict
372+
oci raw-request \
373+
--http-method POST \
374+
--target-uri $uri \
375+
--request-body file://data-payload
376+
Expected output of raw-request command
--------------------------------------

.. code-block:: bash

    {
      "data": {
        "prediction": [
          [
            0.0159152802079916,
            -1.5496551990509033,
            .......
            2.5116958618164062
          ]
        ]
      },
      "headers": {
        "Connection": "keep-alive",
        "Content-Length": "19398",
        "Content-Type": "application/json",
        "Date": "Thu, 08 Dec 2022 18:28:41 GMT",
        "X-Content-Type-Options": "nosniff",
        "opc-request-id": "BD80D931A6EA4C718636ECE00730B255/86111E71C1B33C24988C59C27F15ECDE/E94BBB27AC3F48CB68F41135073FF46B",
        "server": "uvicorn"
      },
      "status": "200 OK"
    }
Pass the prediction output to ``np.argmax`` to retrieve the result of the image prediction:

.. code-block:: python3

    >>> print(np.argmax([[0.0159152802079916,
    ...                   -1.5496551990509033,
    ...                   .......
    ...                   2.5116958618164062]]))
    258
Example
=======
