[Question]: How do I get the exact and full prompt that was passed to the LLM? #8418
Replies: 3 comments
-
🤖 Hello,

Based on the context you've provided, the `formatted_prompt` is indeed the exact prompt that is passed to the LLM. Judging by its contents, it is created by filling the text QA template with the retrieved context and your query.

As for the `query_wrapper_prompt`, it's not explicitly mentioned in the context you've provided, so I can't give a detailed explanation of its purpose. In general, prompts in the LlamaIndex framework are used to transform a query into another query given the existing context; they can also add formatting instructions for the output. If `query_wrapper_prompt` is part of the LlamaIndex framework, it might be used in a different part of the process not covered in the provided context.

For a complete understanding, I would recommend referring to the documentation or code where `query_wrapper_prompt` is used or defined. If you could provide more information about where `query_wrapper_prompt` is used in your code, or its intended purpose, I might be able to give a more detailed answer.
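For anyone trying to reproduce this: a minimal sketch of how to capture and print the recorded prompt, assuming the 0.8.x-era LlamaIndex API (`ServiceContext`, `LlamaDebugHandler`). The payload key names (e.g. `formatted_prompt`) vary between releases, and `./data` is a placeholder path:

```python
from llama_index import ServiceContext, SimpleDirectoryReader, VectorStoreIndex
from llama_index.callbacks import CallbackManager, LlamaDebugHandler

# Attach a debug handler so every LLM call is recorded.
llama_debug = LlamaDebugHandler(print_trace_on_end=False)
service_context = ServiceContext.from_defaults(
    callback_manager=CallbackManager([llama_debug])
)

documents = SimpleDirectoryReader("./data").load_data()  # placeholder path
index = VectorStoreIndex.from_documents(documents, service_context=service_context)
query_engine = index.as_query_engine()

query_engine.query("My query")

# Each entry is a (start, end) event pair for one LLM call; the start
# event's payload carries the fully formatted prompt string (under
# 'formatted_prompt' in the version used in this thread).
for start_event, end_event in llama_debug.get_llm_inputs_outputs():
    print(start_event.payload)
```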
-
I would jump in here, because I have a similar issue, or a lack of understanding.
So, defining:
But in the debug output I see:
So, is the expectation wrong that the Llama-specific prompt template is used? Thanks in advance!
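A hedged note on what is likely happening here, based on the 0.8.x-era LlamaIndex source rather than anything confirmed in this thread: `query_wrapper_prompt` is an argument of `HuggingFaceLLM`, and it is applied inside the LLM object itself, after the query engine has already built `formatted_prompt` from the text QA template. A sketch of that flow, with placeholder model names:

```python
from llama_index.llms import HuggingFaceLLM
from llama_index.prompts import PromptTemplate

# Llama-2 chat wrapper. {query_str} receives the *entire* assembled
# prompt (text QA template + retrieved context + user query), not just
# the raw user query.
query_wrapper_prompt = PromptTemplate(
    "[INST]<<SYS>>\nYou are an AI assistant that answers questions in a "
    "friendly manner, based on the given source documents. <</SYS>>\n\n"
    "{query_str}[/INST]"
)

llm = HuggingFaceLLM(
    model_name="meta-llama/Llama-2-7b-chat-hf",      # placeholder
    tokenizer_name="meta-llama/Llama-2-7b-chat-hf",  # placeholder
    query_wrapper_prompt=query_wrapper_prompt,
)

# Inside HuggingFaceLLM.complete() the call is roughly:
#   full_prompt = query_wrapper_prompt.format(query_str=prompt)
# i.e. the wrapping happens below the layer where the debug callback
# records 'formatted_prompt', which would explain why it never shows
# up in the logged prompt.
```

If that is accurate, the debug output above is expected: the final string sent to the model is `query_wrapper_prompt` with the whole `formatted_prompt` substituted for `{query_str}`.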
-
Did either of you resolve this? I've got similar questions.
-
Question
I'm not sure `query_wrapper_prompt` is actually being used, as I see some conflicting information from `llama_debug`. So I would like to know exactly what prompt is passed to the LLM when the `query_engine.query(...)` method is called.

Example:
I first call `query_engine.query("My query")`, then I call `llama_debug.get_llm_inputs_outputs()`, which returns the recorded LLM input/output events. You can see that I have a `query_wrapper_prompt` that says:

"[INST]<<SYS>>\nYou are an AI assistant that answers questions in a friendly manner, based on the given source documents. <</SYS>>\n\n{query_str}[/INST]"

while the `formatted_prompt` is:

'Context information is below.\n---------------------\n[retrieved documents]---------------------\nGiven the context information and not prior knowledge, answer the query.\nQuery: My query\nAnswer: '

So it looks like a different template is being used.
Question:
Is `formatted_prompt` the prompt that is passed to the LLM?
If yes, why does it not use my `query_wrapper_prompt`?
If no, then what is the point of `query_wrapper_prompt`? Where does it get used?
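Two follow-up sketches that may help readers with the same question. First, newer LlamaIndex releases (late 0.8.x onward) expose a `get_prompts()` mixin on query engines, which lists the templates the engine itself holds; `query_wrapper_prompt` should not appear there, because it lives on the LLM rather than on the engine (hedged, since the exact keys and methods depend on the version):

```python
# Assumes `query_engine` from the snippet earlier in the thread.
# Keys look like 'response_synthesizer:text_qa_template'.
for key, prompt in query_engine.get_prompts().items():
    print(key)
    print(prompt.get_template())
```

Second, if the goal is to control the 'Context information is below...' part, that string comes from the default text QA template, which can be overridden when building the query engine (again a sketch; the `text_qa_template` keyword is assumed from the 0.8.x API):

```python
from llama_index.prompts import PromptTemplate

# Custom response-synthesis template; {context_str} and {query_str}
# are the same variables the default text QA template uses.
custom_qa_template = PromptTemplate(
    "Source documents:\n{context_str}\n\n"
    "Answer using only the sources above.\n"
    "Question: {query_str}\nAnswer: "
)

query_engine = index.as_query_engine(text_qa_template=custom_qa_template)
```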