Replies: 1 comment
-
🤖 Hey @med-dak, fancy seeing you back here! Hope you're doing well. Based on the context you've provided, it seems you want to get the complete prompts, including the prompt templates and the context data, that are sent to the LLM for every query. To achieve this, you can format the prompt template yourself. Here's an example using one of the default prompts:

```python
from llama_index.prompts.default_prompts import DEFAULT_SUMMARY_PROMPT

# Assume we have the following context and query
context_str = "This is the context data."
query_str = "This is the query."

# Fill in the template's placeholders to generate the full prompt
full_prompt = DEFAULT_SUMMARY_PROMPT.template.format(
    context_str=context_str, query_str=query_str
)
print(full_prompt)
```

This will print out the full prompt that can be sent to the LLM. Note that the actual placeholders in the template string depend on the specific prompt template you use. For more details, you can refer to the LlamaIndex documentation and the source code for the `PromptTemplate` and `ChatPromptTemplate` classes.

If you want to log the full prompts for every query, you can use Python's built-in `logging` module. Here's an example of how you can do this:

```python
import logging

# Set up logging
logging.basicConfig(level=logging.DEBUG)

# Log the full prompt
logging.debug(f"Full prompt: {full_prompt}")
```

This will print out the full prompt to the console every time a query is made. You can adjust the logging level and the output destination as needed. I hope this helps! If you have any further questions, feel free to ask.
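Since the placeholder names vary between templates, it can help to inspect them programmatically before formatting. Here is a stdlib-only sketch of that idea; the `template` string below is a made-up stand-in (real LlamaIndex prompt objects expose a comparable string via their `.template` attribute):

```python
from string import Formatter

# Hypothetical template string standing in for a real prompt template
template = (
    "Context information is below.\n"
    "{context_str}\n"
    "Answer the query: {query_str}\n"
)

# Collect the named placeholders the template expects
placeholders = {
    field for _, field, _, _ in Formatter().parse(template) if field
}
print(sorted(placeholders))  # ['context_str', 'query_str']

# Fill them in to reconstruct the full prompt sent to the LLM
full_prompt = template.format(
    context_str="This is the context data.",
    query_str="This is the query.",
)
```

Knowing the placeholder set up front avoids `KeyError`s when a template expects a variable you didn't supply.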
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
-
How can I get the complete prompts (including the prompt templates and the context data) sent to the LLM for every query?