Replies: 2 comments · 3 replies
-
Hello @Jofthomas!
-
Recently this PR was merged: #4641
-
Hello everyone,
While building an application that uses Haystack, I am trying to use PromptNode to answer questions.
The model I am trying to use is Flan-UL2, since it should give better results than flan-T5-xl or flan-T5-xxl (though note that, at roughly 20B parameters, it is actually heavier than both).
My application is dockerised and ships with the model downloaded from Hugging Face.
However, when I call the PromptNode with Flan-UL2 (from "/app/models/flan-ul2") it does not work, whereas a local flan-T5-xl (from "/app/models/flan-t5-xl") worked fine.
Hence my question: is there a reason Flan-UL2 cannot be used, or am I doing something wrong?
If it cannot be made to work, I would be glad if you could suggest an alternative model with comparable performance that can be run locally (no OpenAI model via API).
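As a sanity check, the checkpoint can first be loaded directly with transformers, outside Haystack, to rule out a problem with the files themselves. A minimal sketch, assuming "/app/models/flan-ul2" contains a standard Hugging Face checkpoint (config.json, tokenizer files, and weights):

# Flan-UL2 is a T5-family seq2seq model, so AutoModelForSeq2SeqLM
# should resolve the right architecture from config.json.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("/app/models/flan-ul2")
model = AutoModelForSeq2SeqLM.from_pretrained("/app/models/flan-ul2")

inputs = tokenizer("Question: What is the capital of France? Answer:", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

If this step already fails (or the process gets killed), the problem is with the checkpoint or with memory, not with Haystack.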
The code in question:
# Haystack v1 imports needed by this snippet
from haystack import Pipeline
from haystack.document_stores import InMemoryDocumentStore
from haystack.nodes import BM25Retriever, PreProcessor, PromptNode, PromptTemplate, Shaper

# In-memory store with BM25 enabled for keyword retrieval
document_store = InMemoryDocumentStore(use_bm25=True)

# Split French documents into 128-word chunks, respecting sentence boundaries
preprocessor = PreProcessor(
    language="fr",
    clean_empty_lines=True,
    clean_whitespace=True,
    clean_header_footer=False,
    split_by="word",
    split_length=128,
    split_respect_sentence_boundary=True,
)

# documents_raw is defined elsewhere in the application
document_store.delete_documents()
documents_preprocessed = preprocessor.process(documents_raw)
document_store.write_documents(documents_preprocessed)

retriever = BM25Retriever(document_store=document_store)

# Long-form QA template; $documents and $query are filled in at run time
lfqa_prompt = PromptTemplate(
    name="lfqa",
    prompt_text="""Synthesize a comprehensive answer from the following text for the given question.
Provide a clear and concise response that summarizes the key points and information presented in the text.
Your answer should be in your own words and be no longer than 50 words.
\n\n Related text: $documents \n\n Question: $query \n\n Answer:""",
)

prompt_node = PromptNode(model_name_or_path="/app/models/flan-ul2", default_prompt_template=lfqa_prompt)

# Merge the retrieved documents into a single string for the prompt
shaper = Shaper(func="join_documents", inputs={"documents": "documents"}, outputs=["documents"])

pipe = Pipeline()
pipe.add_node(component=retriever, name="retriever", inputs=["Query"])
pipe.add_node(component=shaper, name="shaper", inputs=["retriever"])
pipe.add_node(component=prompt_node, name="prompt_node", inputs=["shaper"])

# search_terms comes from the application's request handling
output = pipe.run(query=search_terms)
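For reference, the generated text should end up under the "results" key of the pipeline output, so the answer can be read with output["results"][0].

If the failure turns out to be an out-of-memory error (plausible, given Flan-UL2's size), loading the model in half precision may help; the Flan-UL2 model card suggests bfloat16. A hypothetical variant of the node above, assuming PromptNode's model_kwargs are forwarded to the underlying transformers model:

import torch

# Load Flan-UL2 in bfloat16 to roughly halve its memory footprint
prompt_node = PromptNode(
    model_name_or_path="/app/models/flan-ul2",
    default_prompt_template=lfqa_prompt,
    model_kwargs={"torch_dtype": torch.bfloat16},
)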