ChatPromptTemplate automatically adds a `Human:` prefix before the prompt; is there any way to customize this?
#18962
-
Example Code

```python
from transformers import TextIteratorStreamer, pipeline
from langchain.chains import LLMChain
from langchain_community.llms import HuggingFacePipeline
from langchain_core.prompts import ChatPromptTemplate

# phi2_tk (tokenizer), phi (model), and max_new_tokens are defined earlier
streamer = TextIteratorStreamer(
    tokenizer=phi2_tk, skip_prompt=False, skip_special_tokens=True, timeout=300.0
)
phi2_pipeline = pipeline(
    "text-generation",
    tokenizer=phi2_tk,
    model=phi,
    max_new_tokens=max_new_tokens,
    pad_token_id=phi2_tk.eos_token_id,
    eos_token_id=phi2_tk.eos_token_id,
    device_map="auto",
    streamer=streamer,
)
hf_model = HuggingFacePipeline(pipeline=phi2_pipeline)
prompt = ChatPromptTemplate.from_template("Instruct: {query}")
chain = LLMChain(llm=hf_model, prompt=prompt, return_final_only=False)
query = "5 tips to keep healthy"
chain.run(query)
```

Description

I used LLMChain to build my prompt, but I found that it prefixes my prompt with `Human:`.

System Info

langchain==0.1.8
-
Hey @CharlinChen, great to see you back! Hope you're doing well.

Based on the context provided, it appears that the `Human:` prefix is added automatically when the chat prompt is rendered. To prevent LLMChain from adding it, you would need to find where the prefix is introduced: the customization or prevention of the automatic prefix addition likely occurs either in the prompt loading functions or within the LLMChain class itself.

If the prefix addition is happening within the prompt loading functions, you would need to modify these functions to either remove the prefix addition or make the prefix customizable through the configuration. If it's within the LLMChain class, similar modifications would be required.

Without the implementation details of the prompt loading functions or the LLMChain class, it's not possible to provide a specific solution. You would need to review the implementations of these components in your codebase to identify where the prefix is being added and make the necessary adjustments.

Please note that there was a similar issue, "Can you make system_prefix customizable?", which is now closed. You might find some useful information there.

I hope this helps! If you have further questions or need more clarification, feel free to ask.
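For reference, the prefix in the question can be reproduced with `ChatPromptTemplate` alone: it wraps the template string in a human message, and when the formatted chat prompt is flattened to a plain string for a text-completion model such as `HuggingFacePipeline`, each message is rendered with its role prefix. A minimal sketch (the exact rendering may vary across langchain versions):

```python
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_template("Instruct: {query}")

# The template is stored as a human message; converting the formatted
# chat prompt to a string prepends the "Human: " role prefix.
print(prompt.format_prompt(query="5 tips to keep healthy").to_string())
# Human: Instruct: 5 tips to keep healthy
```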
-
@CharlinChen It is because of the internal `ChatPromptTemplate` default template. If you only want to pass your query to the LLM, you want to use `PromptTemplate` from `langchain_core.prompts.prompt`.
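A minimal sketch of that suggestion, reusing the `hf_model` pipeline from the example code above (assumed to be set up the same way):

```python
from langchain.chains import LLMChain
from langchain_core.prompts.prompt import PromptTemplate

# PromptTemplate renders the template string as-is, so no "Human:"
# role prefix is added before the prompt reaches the model.
prompt = PromptTemplate.from_template("Instruct: {query}")
chain = LLMChain(llm=hf_model, prompt=prompt)
chain.run("5 tips to keep healthy")
```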