Override PromptValues: Using multi-step LLMChains #1664
sirsimonson started this conversation in General · 2 comments · 4 replies
sirsimonson:

I have the following question. Given a flow with two LLMChains (one of them producing the output prediction) and two ChatPromptTemplates: in the second LLMChain I use the output prediction value as a promptValue in the second ChatPromptTemplate. When I use the overrideConfig, I have to override ALL promptValues and cannot reuse the output_prediction; otherwise I get the error "please provide promptValues for 'x'". What am I doing wrong? Is it possible to override only some of the promptValues?

Here is my flow (I want to reuse inference_result but override the context):

[flow screenshot omitted]
Reply:

I think we've put in a bugfix for this in #1645. Are you on the latest version?