Passing instructions to LLM via Prompt-Question #28578
-
You can add new lines to the user message in `query_transform_prompt` directly:

```python
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder

query_transform_prompt = ChatPromptTemplate.from_messages(
    [
        MessagesPlaceholder(variable_name="messages"),
        (
            "user",
            """Given the above conversation, generate a search query to look up in order to get information relevant to the conversation. Only respond with the query, nothing else.
Write appropriate context for each and every task based on what you see in the query. In case the conversation has no content related to the query, please feel free to come up with a real-life experience.""",
        ),
    ]
)
```

This approach keeps your code clean and maintains the prompt's readability without needing to create a new prompt object. Just ensure that the logic and structure of your prompt remain consistent with your intended instructions for the LLM.
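The idea of appending instructions to the existing prompt, rather than building a second prompt object, can be illustrated without LangChain installed. The sketch below mimics the `from_messages` message list with plain tuples; the `base_instruction` and `extra_instruction` names are hypothetical, and in real code you would pass the same list to `ChatPromptTemplate.from_messages`.

```python
# Sketch of extending one user message in a prompt's message list,
# assuming the real code uses LangChain's ChatPromptTemplate.
# Plain tuples stand in for the message objects here.

base_instruction = (
    "Given the above conversation, generate a search query to look up "
    "in order to get information relevant to the conversation. "
    "Only respond with the query, nothing else."
)

# Hypothetical extra line appended to the same user message instead of
# creating and referencing a second prompt object.
extra_instruction = (
    "Write appropriate context for each and every task based on what "
    "you see in the query."
)

messages = [
    # The ("placeholder", ...) tuple stands in for MessagesPlaceholder.
    ("placeholder", "{messages}"),
    ("user", base_instruction + "\n" + extra_instruction),
]

# In real code: query_transform_prompt = ChatPromptTemplate.from_messages(messages)
```

Because the new instruction is concatenated into the existing user message, the rest of the chain that consumes `query_transform_prompt` needs no changes.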
-
I prefer to provide clear instructions to the LLM through the prompt. My question is whether I can add new lines to `query_transform_prompt`, or whether I should create a new prompt and reference it in the code. This is my code; please help.