Custom prompt template #10495
Replies: 9 comments
-
🤖 Hey again, @PradiptaAtByteIQMind-Graph2023! Nice to see you diving into more features. To pass a custom prompt template to the `SQLJoinQueryEngine`, you can supply it both to the `SQLAugmentQueryTransform` and to the engine itself. Here is an example of how you can do this:

```python
from llama_index.prompts import PromptTemplate
from llama_index.query_engine.sql_join_query_engine import (
    SQLJoinQueryEngine,
    SQLAugmentQueryTransform,
)
from llama_index.tools.query_engine import QueryEngineTool

# Define your custom prompt template
custom_prompt_template = """
Your custom prompt template goes here.
"""

# Create a PromptTemplate with your custom prompt
# (BasePromptTemplate is abstract, so use the concrete PromptTemplate class)
custom_prompt = PromptTemplate(custom_prompt_template)

# Create an instance of SQLAugmentQueryTransform with your custom prompt
sql_augment_query_transform = SQLAugmentQueryTransform(
    sql_augment_transform_prompt=custom_prompt
)

# Create an instance of SQLJoinQueryEngine with your custom prompt
sql_join_query_engine = SQLJoinQueryEngine(
    sql_query_tool=QueryEngineTool(...),    # fill in with your SQL query tool
    other_query_tool=QueryEngineTool(...),  # fill in with your other query tool
    sql_join_synthesis_prompt=custom_prompt,
    sql_augment_query_transform=sql_augment_query_transform,
)
```

In this example, replace the placeholder template string with your own prompt text. This will pass your custom prompt template to both the `SQLAugmentQueryTransform` and the `SQLJoinQueryEngine`. For more details, you can refer to the source code of these classes in the LlamaIndex repository.

I hope this helps! If you have any more questions, feel free to ask.

-Dosu

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
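For intuition, a prompt template here is essentially a format string with named placeholders that gets filled in before the text is sent to the LLM. Below is a minimal pure-Python sketch of that substitution step; it needs no LlamaIndex install, and the class name `MiniPromptTemplate` is invented purely for illustration:

```python
class MiniPromptTemplate:
    """Toy stand-in for a prompt template: stores a format string
    with named placeholders and fills them on demand."""

    def __init__(self, template: str):
        self.template = template

    def format(self, **kwargs: str) -> str:
        # str.format raises KeyError for a missing named placeholder,
        # which surfaces template/variable mismatches early
        return self.template.format(**kwargs)


custom_prompt = MiniPromptTemplate(
    "Given the schema:\n{schema}\nWrite SQL for: {question}\nSQL: "
)
filled = custom_prompt.format(
    schema="users(id INTEGER, name TEXT)",
    question="How many users are there?",
)
print(filled)
```

The real `PromptTemplate` does more (prompt types, partial formatting), but the core mechanic is the same substitution shown here.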
-
Suppose I want to pass the table information, some natural language queries, and their corresponding SQL queries as a custom prompt while building the `NLSQLTableQueryEngine` — how do I do that?
-
🤖 To pass a custom prompt template to the `NLSQLTableQueryEngine`, create a `PromptTemplate` and hand it to the engine at initialization:

```python
from llama_index.indices.struct_store.sql_query import NLSQLTableQueryEngine
from llama_index.prompts import PromptTemplate, PromptType

# Define your custom prompt template
MY_CUSTOM_PROMPT_TMPL = (
    "Your custom template here..."
)

# Create a new instance of PromptTemplate with your custom template
my_custom_prompt = PromptTemplate(
    MY_CUSTOM_PROMPT_TMPL, prompt_type=PromptType.TEXT_TO_SQL
)

# Pass your custom prompt to the NLSQLTableQueryEngine during its initialization
nlsql_engine = NLSQLTableQueryEngine(
    sql_database=sql_database,  # your SQLDatabase instance
    text_to_sql_prompt=my_custom_prompt,
)
```

Please replace the placeholder template with your own text. Note that this is a general guide and the exact constructor arguments may vary depending on the version of LlamaIndex you are using. The engine's `query` method then uses this prompt when translating natural language to SQL. Here is the relevant code:

```python
def query(self, str_or_query_bundle: QueryType) -> RESPONSE_TYPE:
    with self.callback_manager.as_trace("query"):
        # if query bundle, just run the query
        if isinstance(str_or_query_bundle, QueryBundle):
            query_str = str_or_query_bundle.query_str
        else:
            query_str = str_or_query_bundle
        raw_response = self.custom_query(query_str)
        return (
            Response(raw_response)
            if isinstance(raw_response, str)
            else raw_response
        )
```

The `query` method accepts either a plain string or a `QueryBundle` and wraps a bare-string result in a `Response`. For more details, you can refer to the source code in the LlamaIndex repository.

I hope this helps! If you have any more questions, feel free to ask.
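The dispatch logic in `query` above — accept either a raw string or a bundle, then wrap plain-string results — can be sketched in isolation. The `QueryBundle` and `Response` classes below are simplified stand-ins, not the real LlamaIndex types, and `custom_query` is a stub:

```python
from dataclasses import dataclass
from typing import Union


@dataclass
class QueryBundle:
    query_str: str


@dataclass
class Response:
    response: str


def custom_query(query_str: str) -> str:
    # stand-in for the engine's real work (prompting, SQL, synthesis)
    return f"answer to: {query_str}"


def query(str_or_query_bundle: Union[str, QueryBundle]) -> Response:
    # if it's a query bundle, pull out the raw query string
    if isinstance(str_or_query_bundle, QueryBundle):
        query_str = str_or_query_bundle.query_str
    else:
        query_str = str_or_query_bundle
    raw_response = custom_query(query_str)
    # wrap bare strings in a Response; pass Response objects through
    return (
        Response(raw_response) if isinstance(raw_response, str) else raw_response
    )


print(query("how many rows?").response)
print(query(QueryBundle("how many rows?")).response)
```

Both call forms produce the same result, which is why engine users can pass either a string or a bundle interchangeably.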
-
Can you give one example of sending natural language queries and their corresponding SQL queries as my custom prompt?
-
🤖 Sure, I can provide an example of how to send natural language queries and their corresponding SQL queries as a custom prompt.

First, you need to define your custom prompt template. This template should include placeholders for the table name, table schema, and the natural language question. Here's an example:

```python
CUSTOM_PROMPT_TMPL = (
    "You are given a table named '{table_name}' with the schema below; "
    "generate a SQLite SQL query to answer the given question.\n"
    "Table schema:\n"
    "{table_schema}\n"
    "Question: {question}\n\n"
    "SQLQuery: "
)
```

Next, create a new instance of the `PromptTemplate` class with this template:

```python
from llama_index.prompts import PromptTemplate, PromptType

CUSTOM_PROMPT = PromptTemplate(
    CUSTOM_PROMPT_TMPL, prompt_type=PromptType.TEXT_TO_SQL
)
```

Now you can use your custom prompt. Note that `SQLAutoVectorQueryEngine.query` only takes the query itself; the text-to-SQL prompt belongs on the underlying SQL query engine that the auto-vector engine wraps:

```python
sql_query_engine = NLSQLTableQueryEngine(
    sql_database=sql_database,  # your SQLDatabase instance
    text_to_sql_prompt=CUSTOM_PROMPT,
)
# wrap sql_query_engine and your vector engine in QueryEngineTools,
# then build the SQLAutoVectorQueryEngine from those tools
engine = SQLAutoVectorQueryEngine(sql_query_tool, vector_query_tool)
result = engine.query(query)
```

In this example, `sql_query_tool` and `vector_query_tool` are assumed to be `QueryEngineTool` instances you have built elsewhere. Please note that the exact constructor arguments may vary between LlamaIndex versions, so check the signatures in your installed version.
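The idea of sending example question/SQL pairs can be made concrete with a small helper that splices the pairs into the template text before the new question. This is plain Python, independent of any LlamaIndex API, and the helper name `build_few_shot_prompt` is invented for illustration:

```python
def build_few_shot_prompt(examples, table_schema, question):
    """Assemble a text-to-SQL prompt that embeds example
    (question, sql) pairs as few-shot demonstrations."""
    example_lines = []
    for ex_question, ex_sql in examples:
        example_lines.append(f"Question: {ex_question}")
        example_lines.append(f"SQLQuery: {ex_sql}")
    return (
        "You are given a table with the schema below; "
        "generate a SQLite SQL query to answer the question.\n"
        f"Table schema:\n{table_schema}\n"
        "Examples:\n" + "\n".join(example_lines) + "\n\n"
        f"Question: {question}\n"
        "SQLQuery: "
    )


prompt = build_few_shot_prompt(
    examples=[("How many users are there?", "SELECT COUNT(*) FROM users;")],
    table_schema="users(id INTEGER, name TEXT)",
    question="List all user names.",
)
print(prompt)
```

The resulting string can then be wrapped in a `PromptTemplate` (or the example pairs can be written directly into the template literal) so the LLM sees the demonstrations before each new question.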
-
@dosu-bot
-
🤖 Sure, you can pass multiple custom prompt templates to the `NLSQLTableQueryEngine` — one per prompt slot. The engine takes a `text_to_sql_prompt` for translating the question to SQL and a `response_synthesis_prompt` for synthesizing the final answer (each slot takes a single prompt, not a list):

```python
from llama_index import ServiceContext, SQLDatabase
from llama_index.indices.struct_store.sql_query import NLSQLTableQueryEngine
from llama_index.prompts import PromptTemplate, PromptType

# Prompt for converting a natural language question to SQL
text_to_sql_prompt = PromptTemplate(
    "Given a natural language question, convert it to SQL.\n"
    "Question: {query_str}\n"
    "SQL: ",
    prompt_type=PromptType.TEXT_TO_SQL,
)

# Prompt for synthesizing a response from the query results
response_synthesis_prompt = PromptTemplate(
    "Given an input question, synthesize a response from the query results.\n"
    "Query: {query_str}\n"
    "SQL: {sql_query}\n"
    "SQL Response: {context_str}\n"
    "Response: ",
    prompt_type=PromptType.SQL_RESPONSE_SYNTHESIS,
)

# Create a new instance of the NLSQLTableQueryEngine class
query_engine = NLSQLTableQueryEngine(
    sql_database=SQLDatabase(...),  # replace with your SQLDatabase instance
    text_to_sql_prompt=text_to_sql_prompt,
    response_synthesis_prompt=response_synthesis_prompt,
    synthesize_response=True,
    service_context=ServiceContext(...),  # replace with your ServiceContext instance
    sql_only=False,
    verbose=True,
)
```

In this example, each prompt is a concrete `PromptTemplate` (the abstract `BasePromptTemplate` cannot be instantiated directly). Please replace the `SQLDatabase(...)` and `ServiceContext(...)` placeholders with your own instances. After creating the engine, call `query_engine.query(...)` with your question. For more information, you can refer to the LlamaIndex codebase.
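To see how the two prompt slots divide the work, here is a toy two-stage pipeline with a stubbed-out LLM call. Everything here (`fake_llm`, the canned outputs, the lambda standing in for database execution) is invented for illustration and runs without LlamaIndex:

```python
TEXT_TO_SQL_TMPL = (
    "Given a natural language question, convert it to SQL.\n"
    "Question: {query_str}\nSQL: "
)
SYNTHESIS_TMPL = (
    "Given an input question, synthesize a response from the query results.\n"
    "Query: {query_str}\nSQL: {sql_query}\n"
    "SQL Response: {context_str}\nResponse: "
)


def fake_llm(prompt: str) -> str:
    # stand-in for a real LLM call; returns canned output per prompt type
    if "convert it to SQL" in prompt:
        return "SELECT COUNT(*) FROM users;"
    return "There are 3 users."


def answer(question: str, run_sql) -> str:
    # stage 1: text_to_sql_prompt turns the question into SQL
    sql = fake_llm(TEXT_TO_SQL_TMPL.format(query_str=question))
    # the SQL is executed against the database
    rows = run_sql(sql)
    # stage 2: response_synthesis_prompt turns the rows into prose
    return fake_llm(
        SYNTHESIS_TMPL.format(query_str=question, sql_query=sql, context_str=rows)
    )


result = answer("How many users are there?", run_sql=lambda sql: "[(3,)]")
print(result)  # → There are 3 users.
```

The real engine follows the same shape: the two prompts are consulted at different stages, which is why they are configured separately rather than as one list.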
-
So can I pass both a sample natural language query and its corresponding SQL query in the prompt template?
-
🤖 Based on the information you've provided, it seems like you want to pass a custom prompt template to the `NLSQLTableQueryEngine`. Here's an example of how you can do this:

```python
from llama_index import SQLDatabase
from llama_index.indices.struct_store.sql_query import NLSQLTableQueryEngine
from llama_index.prompts import PromptTemplate, PromptType

# Define your custom response-synthesis prompt
my_custom_prompt = PromptTemplate(
    "My custom template: {query_str}, {sql_query}, {context_str}",
    prompt_type=PromptType.SQL_RESPONSE_SYNTHESIS,
)

# Initialize the query engine with your custom prompt
engine = NLSQLTableQueryEngine(
    sql_database=SQLDatabase(...),  # provide your SQLDatabase instance
    response_synthesis_prompt=my_custom_prompt,
    # other parameters...
)
```

In this example, the prompt is a concrete `PromptTemplate`; the abstract `BasePromptTemplate` cannot be instantiated or subclassed with just a template string. If you want to customize several prompts, pass each one to its corresponding parameter. For other query engines, you can similarly pass a `text_qa_template` and a `refine_template`:

```python
from llama_index.prompts import PromptTemplate

# Define your custom prompt templates
text_qa_template_str = (
    "Context information is"
    " below.\n---------------------\n{context_str}\n---------------------\nUsing"
    " both the context information and also using your own knowledge, answer"
    " the question: {query_str}\nIf the context isn't helpful, you can also"
    " answer the question on your own.\n"
)
text_qa_template = PromptTemplate(text_qa_template_str)

refine_template_str = (
    "The original question is as follows: {query_str}\nWe have provided an"
    " existing answer: {existing_answer}\nWe have the opportunity to refine"
    " the existing answer (only if needed) with some more context"
    " below.\n------------\n{context_msg}\n------------\nUsing both the new"
    " context and your own knowledge, update or repeat the existing answer.\n"
)
refine_template = PromptTemplate(refine_template_str)

# Pass the templates when building the query engine
print(
    index.as_query_engine(
        text_qa_template=text_qa_template,
        refine_template=refine_template,
    ).query("Who is Joe Biden?")
)
```

Please note that the above code assumes `index` is an existing LlamaIndex index, and that each template parameter takes a single `PromptTemplate`. You can find more information about this in the LlamaIndex documentation.

Regarding your question about passing both a sample natural language query and its corresponding SQL query in the prompt template: yes, the simplest way is to embed the example pairs directly in the template text as few-shot examples (the question/SQL pair below is invented for illustration):

```python
FEW_SHOT_TMPL = (
    "Given an input question, generate a SQL query.\n"
    "Example:\n"
    "Question: How many users signed up in 2023?\n"
    "SQL: SELECT COUNT(*) FROM users WHERE signup_year = 2023;\n\n"
    "Question: {query_str}\n"
    "SQL: "
)
few_shot_prompt = PromptTemplate(FEW_SHOT_TMPL, prompt_type=PromptType.TEXT_TO_SQL)
```

In this example, whenever the prompt is formatted, the example question/SQL pair is included verbatim before the new question, so the LLM can imitate the pattern.

I hope this helps! If you have any other questions, feel free to ask.
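Once the LLM returns SQL, the engine's remaining job is ordinary database work. Here is a self-contained sketch of that execution step against an in-memory SQLite database; the table, data, and "generated" SQL are all invented for illustration:

```python
import sqlite3

# Build a throwaway in-memory database
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany(
    "INSERT INTO users (name) VALUES (?)",
    [("ada",), ("grace",), ("alan",)],
)

# Pretend this SQL came back from the LLM via the text-to-SQL prompt
generated_sql = "SELECT COUNT(*) FROM users;"
rows = conn.execute(generated_sql).fetchall()

# The stringified rows become {context_str} for the response-synthesis prompt
context_str = str(rows)
print(context_str)  # → [(3,)]
```

This is the bridge between the two prompts discussed above: the text-to-SQL prompt produces `generated_sql`, and the query results fill `{context_str}` in the synthesis prompt.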
-
Hi Team
How can I pass a custom prompt template to the `SQLAutoVectorQueryEngine.query` method so that the LLM can understand my database well?
Thanks