Commit d8e3683 (1 parent 3e4aafe): Add langchain_models.rst — 1 file changed, +137 −0

LangChain Integration
*********************

.. versionadded:: 2.9.1

LangChain-compatible models/interfaces are needed for LangChain applications to invoke the OCI Generative AI service or LLMs deployed on the OCI Data Science Model Deployment service.

.. admonition:: Preview Feature
   :class: note

   While the official integration of OCI and LangChain will be added to the LangChain library, ADS provides a preview version of the integration.
   It is important to note that the APIs of the preview version may change in the future.

Integration with Generative AI
==============================

The `OCI Generative AI service <https://www.oracle.com/artificial-intelligence/generative-ai/large-language-models/>`_ provides text generation, summarization, and embedding models.

To use the text generation model as an LLM in LangChain:

.. code-block:: python3

    from ads.llm import GenerativeAI

    llm = GenerativeAI(
        compartment_id="<compartment_ocid>",
        # Optionally you can specify keyword arguments for the OCI client, e.g. service_endpoint.
        client_kwargs={
            "service_endpoint": "https://generativeai.aiservice.us-chicago-1.oci.oraclecloud.com"
        },
    )

    llm.invoke("Translate the following sentence into French:\nHow are you?\n")

Here is an example of using a prompt template and the OCI Generative AI LLM to build a translation app:

.. code-block:: python3

    from langchain.prompts import PromptTemplate
    from langchain.schema.runnable import RunnableParallel, RunnablePassthrough

    from ads.llm import GenerativeAI

    # Map the input into a dictionary
    map_input = RunnableParallel(text=RunnablePassthrough())
    # Template for the input text.
    template = PromptTemplate.from_template(
        "Translate the text into French.\nText:{text}\nFrench translation: "
    )
    llm = GenerativeAI(
        compartment_id="<compartment_ocid>",
        # Optionally you can specify keyword arguments for the OCI client, e.g. service_endpoint.
        client_kwargs={
            "service_endpoint": "https://generativeai.aiservice.us-chicago-1.oci.oraclecloud.com"
        },
    )

    # Build the app as a chain
    translation_app = map_input | template | llm

    # Now you have a translation app.
    translation_app.invoke("How are you?")
    # "Comment ça va?"

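The ``|`` operator composes LangChain runnables into a pipeline: each component's output becomes the next component's input. Here is a minimal plain-Python sketch of that data flow; the ``fake_llm`` below is a stand-in for the service call, not the real ``GenerativeAI`` model:

```python
# Toy sketch (plain Python, not LangChain) of the chain's data flow.
# Each "stage" is a function; piping means feeding one stage's output to the next.
from functools import reduce


def map_input(text):
    # Mirrors RunnableParallel(text=RunnablePassthrough()): wrap the input in a dict.
    return {"text": text}


def fill_template(inputs):
    # Mirrors PromptTemplate.from_template(...): fill the {text} placeholder.
    return "Translate the text into French.\nText:{text}\nFrench translation: ".format(**inputs)


def fake_llm(prompt):
    # Stand-in for the GenerativeAI call; a real LLM would generate this.
    return "Comment ça va?"


def chain(*stages):
    # Feed the value through each stage in order, like runnable | runnable | ...
    return lambda value: reduce(lambda acc, stage: stage(acc), stages, value)


translation_app = chain(map_input, fill_template, fake_llm)
print(translation_app("How are you?"))  # -> Comment ça va?
```

The real chain behaves the same way, except each stage is a LangChain ``Runnable`` and the final stage actually calls the OCI service.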
Similarly, you can use the embedding model:

.. code-block:: python3

    from ads.llm import GenerativeAIEmbeddings

    embed = GenerativeAIEmbeddings(
        compartment_id="<compartment_ocid>",
        # Optionally you can specify keyword arguments for the OCI client, e.g. service_endpoint.
        client_kwargs={
            "service_endpoint": "https://generativeai.aiservice.us-chicago-1.oci.oraclecloud.com"
        },
    )

    embed.embed_query("How are you?")

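``embed_query`` returns a vector representation of the text. A common downstream use of embeddings is semantic similarity; as a minimal sketch with made-up vectors (real vectors would come from the embedding model above, and are much higher-dimensional):

```python
import math


def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: values near 1.0 mean similar direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


# Hypothetical embeddings; embed_query would return vectors playing this role.
query_vec = [0.1, 0.9, 0.2]
doc_vec = [0.1, 0.8, 0.3]
other_vec = [0.9, 0.0, 0.1]

# The semantically closer document scores higher.
print(cosine_similarity(query_vec, doc_vec) > cosine_similarity(query_vec, other_vec))  # True
```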
Integration with Model Deployment
=================================

If you deploy an open-source or your own LLM on the OCI model deployment service using `vLLM <https://docs.vllm.ai/en/latest/>`_ or `HuggingFace TGI <https://huggingface.co/docs/text-generation-inference/index>`_, you can use ``ModelDeploymentVLLM`` or ``ModelDeploymentTGI`` to integrate your model with LangChain.

.. code-block:: python3

    from ads.llm import ModelDeploymentVLLM

    llm = ModelDeploymentVLLM(
        endpoint="https://<your_model_deployment_endpoint>/predict",
        model="<model_name>",
    )

.. code-block:: python3

    from ads.llm import ModelDeploymentTGI

    llm = ModelDeploymentTGI(
        endpoint="https://<your_model_deployment_endpoint>/predict",
    )

Authentication
==============

By default, the integration uses the same authentication method configured with ``ads.set_auth()``. Optionally, you can also pass the ``auth`` keyword argument when initializing the model to use a specific authentication method for the model. For example, to use resource principal for all OCI authentication:

.. code-block:: python3

    import ads
    from ads.llm import GenerativeAI

    ads.set_auth(auth="resource_principal")

    llm = GenerativeAI(
        compartment_id="<compartment_ocid>",
        # Optionally you can specify keyword arguments for the OCI client, e.g. service_endpoint.
        client_kwargs={
            "service_endpoint": "https://generativeai.aiservice.us-chicago-1.oci.oraclecloud.com"
        },
    )

Alternatively, you may use a specific authentication method for the model:

.. code-block:: python3

    import ads
    from ads.llm import GenerativeAI

    llm = GenerativeAI(
        # Use security token authentication for the model
        auth=ads.auth.security_token(profile="my_profile"),
        compartment_id="<compartment_ocid>",
        # Optionally you can specify keyword arguments for the OCI client, e.g. service_endpoint.
        client_kwargs={
            "service_endpoint": "https://generativeai.aiservice.us-chicago-1.oci.oraclecloud.com"
        },
    )
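
The precedence described above (a per-model ``auth`` argument overrides the global ``ads.set_auth()`` default) can be sketched in plain Python; the names and structure here are illustrative, not the actual ADS implementation:

```python
# Toy sketch of the auth-resolution precedence; not the real ADS code.
_global_auth = {"method": "api_key"}  # what ads.set_auth() would configure


def set_auth(method):
    # Mirrors ads.set_auth(): set the process-wide default.
    _global_auth["method"] = method


def resolve_auth(model_auth=None):
    # A per-model auth argument wins; otherwise fall back to the global default.
    return model_auth if model_auth is not None else _global_auth["method"]


set_auth("resource_principal")
print(resolve_auth())                  # -> resource_principal
print(resolve_auth("security_token"))  # -> security_token
```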
