
Commit 8737099

Update Gen AI endpoints in code examples.
1 parent 2ed7a55 commit 8737099

File tree

2 files changed: +13 additions, -13 deletions


docs/source/user_guide/large_language_model/langchain_models.rst
Lines changed: 6 additions & 6 deletions

@@ -26,7 +26,7 @@ To use the text generation model as LLM in LangChain:
    compartment_id="<compartment_ocid>",
    # Optionally you can specify keyword arguments for the OCI client, e.g. service_endpoint.
    client_kwargs={
-        "service_endpoint": "https://generativeai.aiservice.us-chicago-1.oci.oraclecloud.com"
+        "service_endpoint": "https://inference.generativeai.us-chicago-1.oci.oraclecloud.com"
    },
)

@@ -50,7 +50,7 @@ Here is an example of using prompt template and OCI generative AI LLM to build a
    compartment_id="<compartment_ocid>",
    # Optionally you can specify keyword arguments for the OCI client, e.g. service_endpoint.
    client_kwargs={
-        "service_endpoint": "https://generativeai.aiservice.us-chicago-1.oci.oraclecloud.com"
+        "service_endpoint": "https://inference.generativeai.us-chicago-1.oci.oraclecloud.com"
    },
)

@@ -71,7 +71,7 @@ Similarly, you can use the embedding model:
    compartment_id="<compartment_ocid>",
    # Optionally you can specify keyword arguments for the OCI client, e.g. service_endpoint.
    client_kwargs={
-        "service_endpoint": "https://generativeai.aiservice.us-chicago-1.oci.oraclecloud.com"
+        "service_endpoint": "https://inference.generativeai.us-chicago-1.oci.oraclecloud.com"
    },
)

@@ -115,7 +115,7 @@ By default, the integration uses the same authentication method configured with
    compartment_id="<compartment_ocid>",
    # Optionally you can specify keyword arguments for the OCI client, e.g. service_endpoint.
    client_kwargs={
-        "service_endpoint": "https://generativeai.aiservice.us-chicago-1.oci.oraclecloud.com"
+        "service_endpoint": "https://inference.generativeai.us-chicago-1.oci.oraclecloud.com"
    },
)

@@ -132,6 +132,6 @@ Alternatively, you may use specific authentication for the model:
    compartment_id="<compartment_ocid>",
    # Optionally you can specify keyword arguments for the OCI client, e.g. service_endpoint.
    client_kwargs={
-        "service_endpoint": "https://generativeai.aiservice.us-chicago-1.oci.oraclecloud.com"
+        "service_endpoint": "https://inference.generativeai.us-chicago-1.oci.oraclecloud.com"
    },
-)
+)
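For reference, a minimal usage sketch of the updated configuration from langchain_models.rst follows. It is not part of the diff: the ads.llm import path, the set_auth call, and the final invoke() are assumptions based on the ADS LangChain integration described in these docs.

    # Minimal sketch (assumed glue code, not part of this commit):
    # point the LangChain LLM at the updated inference endpoint.
    import ads
    from ads.llm import GenerativeAI

    ads.set_auth("api_key")  # or "resource_principal", depending on where this runs

    llm = GenerativeAI(
        compartment_id="<compartment_ocid>",
        client_kwargs={
            "service_endpoint": "https://inference.generativeai.us-chicago-1.oci.oraclecloud.com"
        },
    )

    # invoke() with recent LangChain releases; older versions call the LLM directly, e.g. llm("...").
    print(llm.invoke("Translate the following sentence into French: Hello!"))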

docs/source/user_guide/large_language_model/retrieval.rst
Lines changed: 7 additions & 7 deletions

@@ -24,7 +24,7 @@ The following code snippet shows how to use the Generative AI Embedding Models:
oci_embedings = GenerativeAIEmbeddings(
    compartment_id="ocid1.compartment.####",
-    client_kwargs=dict(service_endpoint="https://generativeai.aiservice.us-chicago-1.oci.oraclecloud.com") # this can be omitted after Generative AI service is GA.
+    client_kwargs=dict(service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com") # this can be omitted after Generative AI service is GA.
)

Retrieval QA with OpenSearch

@@ -74,7 +74,7 @@ Since the search result usually cannot be directly used to answer a specific que
oci_llm = GenerativeAI(
    compartment_id="ocid1.compartment.####",
-    client_kwargs=dict(service_endpoint="https://generativeai.aiservice.us-chicago-1.oci.oraclecloud.com") # this can be omitted after Generative AI service is GA.
+    client_kwargs=dict(service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com") # this can be omitted after Generative AI service is GA.
)

retriever = opensearch_vector_search.as_retriever(search_kwargs={"vector_field": "embeds",

@@ -137,7 +137,7 @@ Similarly, you can use FAISS Vector Store as a retriever to build a retrieval QA
oci_llm = GenerativeAI(
    compartment_id="ocid1.compartment.####",
-    client_kwargs=dict(service_endpoint="https://generativeai.aiservice.us-chicago-1.oci.oraclecloud.com") # this can be omitted after Generative AI service is GA.
+    client_kwargs=dict(service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com") # this can be omitted after Generative AI service is GA.
)
retriever = db.as_retriever()
qa = RetrievalQA.from_chain_type(

@@ -214,12 +214,12 @@ Here is an example code snippet for deployment of Retrieval QA using OpenSearch
oci_embedings = GenerativeAIEmbeddings(
    compartment_id="ocid1.compartment.####",
-    client_kwargs=dict(service_endpoint="https://generativeai.aiservice.us-chicago-1.oci.oraclecloud.com") # this can be omitted after Generative AI service is GA.
+    client_kwargs=dict(service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com") # this can be omitted after Generative AI service is GA.
)

oci_llm = GenerativeAI(
    compartment_id="ocid1.compartment.####",
-    client_kwargs=dict(service_endpoint="https://generativeai.aiservice.us-chicago-1.oci.oraclecloud.com") # this can be omitted after Generative AI service is GA.
+    client_kwargs=dict(service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com") # this can be omitted after Generative AI service is GA.
)

import os

@@ -288,12 +288,12 @@ Here is an example code snippet for deployment of Retrieval QA using FAISS as a
ads.set_auth("resource_principal")
oci_embedings = GenerativeAIEmbeddings(
    compartment_id="ocid1.compartment.####",
-    client_kwargs=dict(service_endpoint="https://generativeai.aiservice.us-chicago-1.oci.oraclecloud.com") # this can be omitted after Generative AI service is GA.
+    client_kwargs=dict(service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com") # this can be omitted after Generative AI service is GA.
)

oci_llm = GenerativeAI(
    compartment_id="ocid1.compartment.####",
-    client_kwargs=dict(service_endpoint="https://generativeai.aiservice.us-chicago-1.oci.oraclecloud.com") # this can be omitted after Generative AI service is GA.
+    client_kwargs=dict(service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com") # this can be omitted after Generative AI service is GA.
)

loader = TextLoader("your.txt")
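And a corresponding sketch of the FAISS retrieval QA flow touched in retrieval.rst, using the new endpoint. The LangChain import paths and the CharacterTextSplitter step are assumptions filled in around the snippets shown in the diff, not part of this commit.

    # Assumed imports (langchain 0.1-era paths) and glue code around the diffed snippets.
    import ads
    from ads.llm import GenerativeAI, GenerativeAIEmbeddings
    from langchain.chains import RetrievalQA
    from langchain.document_loaders import TextLoader
    from langchain.text_splitter import CharacterTextSplitter
    from langchain.vectorstores import FAISS

    ads.set_auth("resource_principal")

    # The updated inference endpoint from this commit.
    service_endpoint = "https://inference.generativeai.us-chicago-1.oci.oraclecloud.com"

    oci_embedings = GenerativeAIEmbeddings(
        compartment_id="ocid1.compartment.####",
        client_kwargs=dict(service_endpoint=service_endpoint),
    )

    oci_llm = GenerativeAI(
        compartment_id="ocid1.compartment.####",
        client_kwargs=dict(service_endpoint=service_endpoint),
    )

    # Build a small FAISS index from a local text file and wire it into RetrievalQA.
    documents = TextLoader("your.txt").load()
    docs = CharacterTextSplitter(chunk_size=1000, chunk_overlap=0).split_documents(documents)
    db = FAISS.from_documents(docs, oci_embedings)

    qa = RetrievalQA.from_chain_type(llm=oci_llm, chain_type="stuff", retriever=db.as_retriever())
    print(qa.run("What does the document say about service endpoints?"))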
