-
To dynamically pass grammar to a `HuggingFaceEndpoint`, declare the relevant fields as configurable and override them per call with `with_config`:

```python
from langchain_community.llms.huggingface_endpoint import HuggingFaceEndpoint
from langchain_core.runnables import ConfigurableField

# Define your grammars (or any other additional parameters)
grammar1 = {"some_grammar_key": "some_grammar_value1"}
grammar2 = {"some_grammar_key": "some_grammar_value2"}

# Initialize the HuggingFaceEndpoint and mark fields as configurable
llm_base = HuggingFaceEndpoint(
    endpoint_url="http://localhost:8010/",
    max_new_tokens=1024,
    temperature=0.1,
    seed=60,
    model_kwargs=grammar1,  # initial grammar
).configurable_fields(
    temperature=ConfigurableField(
        id="llm_temperature",
        name="LLM Temperature",
        description="The temperature of the LLM",
    ),
    model_kwargs=ConfigurableField(
        id="llm_model_kwargs",
        name="LLM model kwargs",
        description="Extra endpoint parameters, e.g. the grammar",
    ),
)

print(llm_base.invoke("pick a random number"))
print("-" * 50)

# Swap the grammar dynamically using with_config
llm_with_grammar1 = llm_base.with_config(configurable={"llm_model_kwargs": grammar1})
llm_with_grammar2 = llm_base.with_config(configurable={"llm_model_kwargs": grammar2})
print(llm_with_grammar1.invoke("pick a random number"))
print("-" * 50)
print(llm_with_grammar2.invoke("pick a random number"))
```

In this example, the grammar lives in `model_kwargs`, and `with_config(configurable=...)` swaps it per call without rebuilding the model.
-
I found a solution as follows:
How can I make the above code cleaner? E.g., how can I chain chat_model with
-
We have the following error:
-
Traceback (most recent call last):
-
Description
I'm trying to pass grammar dynamically to the HuggingFaceEndpoint. So far, the cleanest way I found was using configuration; however, the above code gives me an error.