How to use a SmartLLMChain #10545
cobusgreyling started this conversation in General
Replies: 1 comment
-
I found the problem: the parameter should be resolver_llm. The documentation incorrectly refers to resolve_llm.
-
I'm trying to run the SmartLLMChain code (https://python.langchain.com/docs/use_cases/more/self_check/smart_llm) and use different LLMs for ideation_llm, critique_llm, and resolve_llm.
ideation_llm and critique_llm work, but when I pass resolve_llm as in the code below:
chain = SmartLLMChain(
    resolve_llm=ChatOpenAI(temperature=0.9, model_name="text-davinci-003"),
    llm=ChatOpenAI(
        temperature=0, model_name="gpt-4"
    ),  # will be used for critique and resolution as no specific llms are given
    prompt=prompt,
    n_ideas=3,
    verbose=True,
)
I get the error:
ValidationError: 1 validation error for SmartLLMChain
resolve_llm
extra fields not permitted (type=value_error.extra)
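For context on why this fails rather than being silently ignored: the error is a pydantic v1 validation error. SmartLLMChain is a pydantic model whose config forbids extra fields, so any keyword that doesn't match a declared field (here, resolve_llm instead of the declared resolver_llm) is rejected at construction time. A minimal stand-in model reproduces the same behavior (the Chain class below is hypothetical, for illustration only; it assumes pydantic is installed):

```python
# Reproduce the "extra fields not permitted" error with a toy model.
# pydantic 2.x ships the v1 API under pydantic.v1; fall back for 1.x installs.
try:
    from pydantic.v1 import BaseModel, Extra, ValidationError
except ImportError:
    from pydantic import BaseModel, Extra, ValidationError


class Chain(BaseModel):
    """Stand-in for SmartLLMChain: only declared fields are accepted."""

    resolver_llm: str = "gpt-4"  # the correctly spelled field

    class Config:
        extra = Extra.forbid  # unknown keywords raise ValidationError


Chain(resolver_llm="gpt-4")  # OK: matches a declared field

try:
    Chain(resolve_llm="gpt-4")  # typo: not a declared field
except ValidationError as e:
    print(e)  # reports an extra-field validation error for resolve_llm
```

So the fix in the reply above is not just a naming convention: with Extra.forbid, the misspelled keyword can never reach the chain, and the only working spelling is the declared one.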