How can I give a System Message to LLM? #20407
Replies: 2 comments 1 reply
-
To integrate a system message into LangChain's workflow with the ChatOpenAI model, pass a SystemMessage alongside your HumanMessage when calling the model.

Here's a snippet showing how to implement this:

```python
from langchain_core.messages import SystemMessage, HumanMessage
from langchain_openai import ChatOpenAI

# Initialize your ChatOpenAI model
model = ChatOpenAI(model="gpt-3.5-turbo")

# Create a system message
system_message = SystemMessage(content="You are a helpful assistant.")

# Create a human message (replace "Hello" with your actual query)
human_message = HumanMessage(content="Hello")

# Pass both messages to the model (invoke replaces the deprecated model(...) call)
response = model.invoke([system_message, human_message])

# Process the response
answer = response.content
```

This approach allows the system message to define the assistant's behavior before your query is processed.
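As a side note, recent versions of langchain-core also accept plain (role, content) tuples, so the same call can be written without constructing message objects explicitly. A minimal sketch, assuming a recent langchain-core:

```python
from langchain_openai import ChatOpenAI

model = ChatOpenAI(model="gpt-3.5-turbo")

# (role, content) tuples are coerced into SystemMessage / HumanMessage objects
response = model.invoke([
    ("system", "You are a helpful assistant."),
    ("human", "Hello"),
])
print(response.content)
```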
-
Take a look at https://python.langchain.com/docs/modules/model_io/prompts/quick_start/#chatprompttemplate
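For reference, a minimal sketch of the ChatPromptTemplate approach described on that page (the model name and prompt text here are placeholders, not from the original reply):

```python
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

# Build a prompt with a fixed system message and a templated human message
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    ("human", "{input}"),
])

model = ChatOpenAI(model="gpt-3.5-turbo")

# Compose the prompt and model into a chain, then invoke it with the user's input
chain = prompt | model
response = chain.invoke({"input": "Hello"})
print(response.content)
```

The advantage over passing raw messages is that the system message is baked into the prompt template, so every invocation of the chain reuses it automatically.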
-
Description
Hey everyone, I have a problem with giving a system message to an LLM using LangChain. With the OpenAI API we can do this easily, like this: {"role": "system", "content": "You are a helpful assistant."}.
But how can I apply it using LangChain?
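For comparison, a minimal sketch of the OpenAI-style call being described (the client setup and model name are assumptions, not part of the original post):

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Plain OpenAI API: the system message is just a dict with role "system"
completion = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello"},
    ],
)
print(completion.choices[0].message.content)
```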