-
Hold on, it appears that your retriever is missing. When troubleshooting these pipelines, it's advisable to add components one by one to ensure they're functioning as expected. For instance, first confirm that your retriever is producing the desired document outputs. Once that's confirmed, you can add the PromptNode as the final component, feeding it both the query and the documents from the retriever.
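As a rough illustration, a standalone check of the retriever could look like the sketch below. This assumes Haystack 1.x with a `BM25Retriever` on top of an `InMemoryDocumentStore` that already contains your documents; swap in whatever retriever and document store you actually use.

```python
# Minimal sketch: verify the retriever alone returns sensible documents
# before wiring up the PromptNode. Component choices here are assumptions.
from haystack.document_stores import InMemoryDocumentStore
from haystack.nodes import BM25Retriever

document_store = InMemoryDocumentStore(use_bm25=True)  # assumes documents were already written here
retriever = BM25Retriever(document_store=document_store)

docs = retriever.retrieve(query="what is Murabaha?", top_k=3)
for doc in docs:
    print(doc.meta, "->", doc.content[:200])
```

If this prints relevant passages, the retriever side is working and you can add the PromptNode on top of it.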
-
Ok, I see: you are missing the join-documents part in your PromptTemplate, so no documents are injected into the prompt sent to the LLM. See the example in the PromptTemplate class for how to do that, or read the docs. Also note that a top_k of 10 with documents that long is going to overflow your context window. You'll need a much more powerful model, such as gpt-3.5-turbo, to handle that many documents in the context window.
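A rough sketch of the fix (Haystack 1.x; the model name, API key, and max_length below are placeholder assumptions, not the only valid choices): the template needs a `{join(documents)}` placeholder, and it has to be handed to the PromptNode.

```python
# Sketch: a PromptTemplate that injects the retrieved documents into the prompt,
# plus a PromptNode that uses it. Model and API key values are illustrative only.
from haystack.nodes import AnswerParser, PromptNode, PromptTemplate

lfqa_prompt = PromptTemplate(
    prompt="""Synthesize a comprehensive answer from the following documents for the given question.

Documents: {join(documents)}

Question: {query}

Answer:""",
    output_parser=AnswerParser(),
)

generator = PromptNode(
    model_name_or_path="gpt-3.5-turbo",   # larger context window than the default flan-t5-base
    api_key="YOUR_OPENAI_API_KEY",        # placeholder
    default_prompt_template=lfqa_prompt,
    max_length=512,
)
```

With this in place, the documents returned by the retriever are joined into the prompt instead of being silently dropped, and `results` should no longer be empty, provided the retrieved documents fit in the model's context window.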
-
Hello,
I am creating a generative QA pipeline with PromptNode and I am not getting any results.
==========PYTHON CODE=================
from haystack import Pipeline
from haystack.nodes import AnswerParser, PromptNode, PromptTemplate

lfqa_prompt = PromptTemplate(
    prompt="""Synthesize a comprehensive answer for the given question.
    \n\n Question: {query}
    \n\n Answer:""",
    output_parser=AnswerParser()
)

generator = PromptNode()
question = input("input your question: ")

genPipe = Pipeline()
genPipe.add_node(component=retriever, name="Retriever", inputs=["Query"])  # retriever was created earlier (not shown)
genPipe.add_node(component=generator, name="prompt_node", inputs=["Retriever"])

result = genPipe.run(query=question, params={"Retriever": {"top_k": 10}})
print(result)
========================================
output: {'results': [], 'invocation_context': {'query': 'what is Murabaha?',...}
As you can see, 'results' is empty.
I would appreciate your help on this. Thank you very much in advance.