File changed: ``docs/source/user_guide/large_language_model/deploy_langchain_application.rst`` (6 additions, 3 deletions)
@@ -63,13 +63,16 @@ Prepare the Model Artifacts
Call ``prepare`` from ``ChainDeployment`` to generate ``score.py`` and serialize the LangChain application to a ``chain.yaml`` file under the ``artifact_dir`` folder.
The ``inference_conda_env`` and ``inference_python_version`` parameters define the conda environment in which your LangChain application will run on OCI.
Here, replace ``custom_conda_environment_uri`` with the URI of your conda environment that has the latest ADS 2.9.1, and replace ``python_version`` with your conda environment's Python version.
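The step above can be sketched as follows. This is a minimal illustration, not verbatim library usage: the ``llm_chain`` object, the artifact directory, and the conda environment URI are placeholders you must replace with your own values, and the ``ChainDeployment`` calls are shown commented out because they require an OCI tenancy and ``oracle-ads`` 2.9.1 or later.

```python
# Hypothetical sketch of preparing model artifacts with ADS ChainDeployment.
# The environment URI and python version below are placeholder assumptions.

# from ads.llm.deploy import ChainDeployment  # requires oracle-ads >= 2.9.1

# Keyword arguments for prepare(); substitute your published conda
# environment's Object Storage URI and its Python version.
prepare_kwargs = {
    "inference_conda_env": "oci://<bucket>@<namespace>/<path_to_conda_env>",
    "inference_python_version": "3.9",
}

# chain_deployment = ChainDeployment(chain=llm_chain, artifact_dir="artifact/")
# chain_deployment.prepare(**prepare_kwargs)
```

After ``prepare`` runs, ``score.py`` and ``chain.yaml`` should appear under the artifact directory you passed in.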
.. note::

   For instructions on how to customize and publish a conda environment, refer to `Publishing a Conda Environment to an Object Storage Bucket <https://docs.oracle.com/en-us/iaas/data-science/using/conda_publishs_object.htm>`_.
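As a rough sketch of the publishing workflow the note refers to: the ``odsc conda`` CLI commands below are an assumption about the typical flow (the CLI ships with the ``oracle-ads`` package), and the bucket, namespace, and slug values are placeholders; treat the linked Oracle documentation as authoritative.

```shell
# Sketch only -- assumed odsc CLI workflow; verify against the Oracle docs.
#
#   odsc conda init -b <bucket-name> -n <namespace>   # point the CLI at your bucket
#   odsc conda publish -s <environment-slug>          # upload the environment
#
echo "publish the conda environment, then pass its URI as inference_conda_env"
```

Once published, the environment's Object Storage URI is what you pass as ``inference_conda_env``.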
Below is the ``chain.yaml`` file that was saved from the ``llm_chain`` object. For more information about LLM model serialization, see `here <https://python.langchain.com/docs/modules/model_io/llms/llm_serialization>`_.