Replies: 2 comments 4 replies
-
I think that some models, like google/gemini-2.5-flash-preview-05-20:thinking, can't see multiple messages; they combine them internally into one. Meanwhile GPT-4.1 returns the correct answer: blue blue
-
Maintaining chat continuity (historical messages) may vary across models (it has nothing to do with OpenRouter). OpenRouter mainly follows the best practices of the well-known LLM provider, OpenAI. This is basically how chat continuity works: you send your previous conversation along with the new message:

$model = 'mistralai/mistral-7b-instruct:free';
$firstMessage = new MessageData(
role: RoleType::USER,
content: 'Is it going to rain today?'
);
$chatData = new ChatData(
messages: [
$firstMessage,
],
model: $model,
);
// This is the chat which you want LLM to remember
$oldResponse = LaravelOpenRouter::chatRequest($chatData); // e.g. LLM responds: 'No, it is not rainy today'
/*
* You can skip part above and just create your historical message below (maybe you retrieve historical messages from DB etc.)
*/
// Here adding historical response to new message
$historicalMessage = new MessageData(
role: RoleType::ASSISTANT, // Set as assistant since it is a historical message retrieved previously
content: Arr::get($oldResponse->choices[0], 'message.content'), // Historical response content retrieved from previous chat request
);
// This is your new message
$newMessage = new MessageData(
role: RoleType::USER,
content: 'So I should take my umbrella with me?',
);
$chatData = new ChatData(
messages: [
$firstMessage, // this is what we asked
$historicalMessage, // this is what LLM responded
$newMessage, // this is our new message
],
model: $model,
);
$response = LaravelOpenRouter::chatRequest($chatData);

Without this, you can manipulate the chat continuity with prompt engineering, as you do (the red red blue blue one). But generally speaking, all LLMs try to be coherent and follow the best practices of the well-known ones (meaning they generally handle things the same as, or similarly to, OpenAI).
Anyway, hopefully you figure it out. You are welcome to ask here, test different scenarios, and so on.
-
Just to make sure, since I have a few bugs with the models: is the first element of the messages array the oldest one in the conversation?
Like:
user: hello
agent: hi do you need help
user: yes I need help
When sending it to OpenRouter, do I send it like this, where index 0 is the oldest message? (simplified, of course)
$messages = [0 => 'user: hello', 1 => 'agent: hi do you need help', 2 => 'user: yes I need help'];
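For reference, a minimal sketch of that ordering, assuming the standard OpenAI-style role/content message schema that OpenRouter accepts (plain arrays here instead of the package's MessageData objects, just for illustration):

```php
<?php
// Messages are sent in chronological order: index 0 is the oldest turn,
// and the last element is the newest user message.
$messages = [
    ['role' => 'user',      'content' => 'hello'],                  // oldest
    ['role' => 'assistant', 'content' => 'hi, do you need help?'],  // model's reply
    ['role' => 'user',      'content' => 'yes I need help'],        // newest
];

echo $messages[0]['content'], "\n"; // oldest message: hello
```

Note that the model's turns use the role "assistant" (not "agent"), matching the RoleType::ASSISTANT constant in the earlier example.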