Deploy LangChain Application
############################

-Oracle ADS SDK now supports the deployment of LangChain application to OCI data science model deployment and you can easily do so just by writing a couple lines of code.
+Oracle ADS supports deploying LangChain applications to OCI Data Science Model Deployment, and you can do so by writing just a few lines of code.

.. versionadded:: 2.9.1

+.. admonition:: Serialization
+   :class: note
+
+   For ADS to serialize and deploy the LangChain application, all components used to build the application must be serializable. For more information regarding LLM serialization, see `here <https://python.langchain.com/docs/modules/model_io/llms/llm_serialization>`_.
+
Configuration
*************

Ensure that you have created the necessary `policies, authentication, and authorization for model deployments <https://docs.oracle.com/en-us/iaas/data-science/using/model-dep-policies-auth.htm#model_dep_policies_auth>`_.
-Here we're using the ``resource_principal`` as auth type and you can configure the policy as below.
+For example, the following policy allows the dynamic group to use ``resource_principal`` authentication to create a model deployment.

.. code-block:: shell

    allow dynamic-group <dynamic-group-name> to manage data-science-model-deployments in compartment <compartment-name>

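If you are running the deployment from a notebook session or a job, a minimal sketch of configuring ADS to authenticate with resource principals (assuming the policy above is in place and the ``oracle-ads`` package is installed) looks like this:

.. code-block:: python3

    import ads

    # Use the resource principal of the notebook session or job for
    # authentication when creating the model deployment.
    ads.set_auth("resource_principal")
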
-Create LangChain Application
-****************************
+LangChain Application
+*********************

-Create a simple LangChain application that links prompt and cohere model as below. Remember to replace the ``<cohere_api_key>`` with the actual cohere api key.
+The following is a simple LangChain application built with a prompt template and a large language model API. Here the ``Cohere`` model is used as an example. You may replace it with any other LangChain-compatible LLM, including the OCI Generative AI service.

.. code-block:: python3

    import os
    from langchain.llms import Cohere
    from langchain.chains import LLMChain
    from langchain.prompts import PromptTemplate
+    # Remember to replace <cohere_api_key> with the actual Cohere API key.
    os.environ["COHERE_API_KEY"] = "<cohere_api_key>"

    cohere = Cohere()
    prompt = PromptTemplate.from_template("Tell me a joke about {subject}")
    llm_chain = LLMChain(prompt=prompt, llm=cohere, verbose=True)

-Now you have a LangChain object ``llm_chain``. Try running it with the prompt ``{"subject": "animals"}`` and it should give you the corresponding result.
+Now you have a LangChain object ``llm_chain``. Try running it with the input ``{"subject": "animals"}`` and it should return a joke about animals.

.. code-block:: python3

-    llm_chain.run({"subject": "animals"})
+    llm_chain.invoke({"subject": "animals"})

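As the serialization note above explains, ADS can only deploy chains whose components are all serializable. A quick way to sanity-check this yourself is to dump the chain configuration with LangChain's built-in ``save()``, which raises an error for configurations it cannot serialize (for example, chains that carry memory). This is only an optional sketch; the ``chain.yaml`` shown later in this guide is generated by ADS itself, and the file name below is arbitrary.

.. code-block:: python3

    # Optional check: dump the chain configuration to a local YAML file using
    # LangChain's built-in serialization. An error is raised for chains that
    # LangChain cannot serialize (e.g. chains with memory).
    llm_chain.save("llm_chain_check.yaml")
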
Initialize the ChainDeployment
******************************

-Initialize class ``ChainDeployment`` from ADS SDK and pass the LangChain object ``llm_chain`` from previous step as parameter.
-The ``artifact_dir`` is an optional parameter which points to the folder where the model artifacts will be put locally.
-In this example, we're using a temporary folder generated by ``tempfile``.
+ADS provides the ``ChainDeployment`` class to handle the deployment of LangChain applications.
+You can initialize ``ChainDeployment`` with the LangChain object ``llm_chain`` from the previous section as a parameter.
+The ``artifact_dir`` is an optional parameter that points to the folder where the model artifacts are stored locally.
+In this example, we're using a temporary folder generated by ``tempfile.mkdtemp()``.

.. code-block:: python3

@@ -75,7 +82,7 @@ Here, replace ``custom_conda_environment_uri`` with your conda environment uri t
        inference_python_version="<python_version>",
    )

-Below is the ``chain.yaml`` file that was saved from ``llm_chain`` object. For more information regarding LLMs model serialization, see `here <https://python.langchain.com/docs/modules/model_io/llms/llm_serialization>`_.
+Below is the ``chain.yaml`` file that was saved from the ``llm_chain`` object.

.. code-block:: YAML

@@ -110,6 +117,16 @@ Below is the ``chain.yaml`` file that was saved from ``llm_chain`` object. For m
    tags: null
    verbose: true

+Verify the Serialized Application
+*********************************
+
+Verify the serialized application by calling ``verify()`` to make sure it is working as expected.
+An error will be raised if your application is not fully serializable.
+
+.. code-block:: python3
+
+    chain_deployment.verify({"subject": "animals"})
+
Save Artifacts to OCI Model Catalog
***********************************
