ConversationChain Not Running Properly #9169
Replies: 4 comments 2 replies
-
My code is not displaying properly above, and for some reason I am not able to edit the formatting.
-
🤖 Hello, Thank you for reaching out and providing detailed information about the issue you're facing. The error message you're seeing, `Got unexpected prompt input variables`, indicates a mismatch between the variables your prompt declares and the ones `ConversationChain` supplies. In your case, it seems like the prompt is not expecting any input variables (`[]`), while the chain passes `history` from its memory and `input` as the normal input key. To resolve this issue, you could modify your prompt so that it declares these variables:

```python
from langchain.schema import AIMessage, HumanMessage, SystemMessage
from langchain.prompts import ChatPromptTemplate, HumanMessagePromptTemplate
import os
from apikey import apikey
from langchain.chat_models import ChatOpenAI
from langchain.chains import ConversationChain

os.environ["OPENAI_API_KEY"] = apikey
chat_model = ChatOpenAI(openai_api_key=apikey)

system_message = SystemMessage(content="Act as a Recruiter for a Big Tech Startup")
human_message = HumanMessagePromptTemplate.from_template("{history}")
prompt = ChatPromptTemplate(messages=[system_message, human_message])

conversation = ConversationChain(llm=chat_model, prompt=prompt, verbose=True)

def input(human_response):  # note: this shadows the built-in input()
    return conversation.run(human_response)
```

In this code, the `HumanMessagePromptTemplate` declares the `history` variable, so the prompt's input variables better match what `ConversationChain` expects. Please try this solution and let me know if it resolves your issue or if you have any other questions.

Sources
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
-
I'm facing the same problem and can't get prompts working with ConversationChain. Adding history into HumanMessagePromptTemplate leads to the same error.
EDIT: HumanMessagePromptTemplate requires both `history` and `input`.
-
Just wanted to point out that this issue is another example of how Langchain looks cool on the outside, but if you try to do something more complex, the nasty things blow up. It would be nice if anyone from Langchain tracked and responded to such bugs.
-
This is the code snippet I am trying to run:

```python
from langchain.schema import AIMessage, HumanMessage, SystemMessage
from langchain.prompts import ChatPromptTemplate
import os
from apikey import apikey
from langchain.chat_models import ChatOpenAI
from langchain.chains import ConversationChain

os.environ["OPENAI_API_KEY"] = apikey
chat_model = ChatOpenAI(openai_api_key=apikey)
system_message = SystemMessage(content="Act as a Recruiter for a Big Tech Startup")
prompt = ChatPromptTemplate(messages=[system_message])
conversation = ConversationChain(llm=chat_model, prompt=prompt, verbose=True)

def input(human_response):  # note: this shadows the built-in input()
    return conversation.run(human_response)
```
And I am facing the below error:

```
ValidationError: 1 validation error for ConversationChain
__root__
  Got unexpected prompt input variables. The prompt expects [], but got
  ['history'] as inputs from memory, and input as the normal input key.
  (type=value_error)
```
The documentation is also not super robust for ConversationChain, and I am really not able to figure out what I can do to resolve this. Since ConversationChain has automatic memory handling, I am really not sure where I am going wrong.
It works fine without using the prompt and system message.
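For anyone puzzled by the error text: a rough, hypothetical model of the check being performed (function names here are illustrative, not the library's internals) is to compare the variables a template declares against the memory key plus the normal input key:

```python
from string import Formatter

def extract_input_variables(template: str) -> set:
    """Collect the {placeholders} a template declares (mimics prompt.input_variables)."""
    return {field for _, field, _, _ in Formatter().parse(template) if field}

def validate_prompt(template: str, memory_keys: set, input_key: str = "input") -> None:
    """Hypothetical model of the chain's validation: the prompt must declare
    exactly the memory keys plus the input key, or a ValueError is raised."""
    expected = memory_keys | {input_key}
    got = extract_input_variables(template)
    if got != expected:
        raise ValueError(
            f"Got unexpected prompt input variables. The prompt expects "
            f"{sorted(got)}, but got {sorted(memory_keys)} as inputs from "
            f"memory, and {input_key} as the normal input key."
        )

# A template declaring both variables passes:
validate_prompt("History: {history}\nHuman: {input}", {"history"})

# A template with no placeholders (like a bare system message) raises,
# because the prompt expects [] while the chain supplies history + input:
# validate_prompt("Act as a Recruiter", {"history"})  # ValueError
```

Under this model, the original snippet fails because the prompt built from only a `SystemMessage` declares no variables at all, while the chain tries to feed it both `history` and `input`.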