Description
Config
OPENAI_MODEL=gpt-3.5-turbo-0125
MAX_TOKENS=16385
Output
2024-07-03 14:39:30,037 - root - INFO - Chat history for chat ID 352569383 is too long. Summarising...
2024-07-03 14:39:34,054 - root - ERROR - This endpoint's maximum context length is 16385 tokens. However, you requested about 17537 tokens (39 of text input, 1113 of tool input, 16385 in the output). Please reduce the length of either one.
Problem: when the chat history is too long, it is summarised down to MAX_TOKENS tokens. That summary is then sent to the model together with the user input and tool input, so the total request is MAX_TOKENS + input + tool_input tokens, which exceeds the MAX_TOKENS context limit.
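A possible fix, sketched below under assumptions: `count_tokens`, `SAFETY_MARGIN`, and `summary_budget` are hypothetical names, not the project's actual API. The idea is to subtract the input and tool-input tokens (plus some headroom) from the context limit before summarising, instead of summarising to the full MAX_TOKENS:

```python
MAX_TOKENS = 16385      # model context window (shared by input and output)
SAFETY_MARGIN = 256     # hypothetical headroom for message framing/overhead


def count_tokens(text: str) -> int:
    # Placeholder token counter for illustration; a real bot would
    # likely use a proper tokenizer such as tiktoken here.
    return len(text.split())


def summary_budget(user_input: str, tool_input: str) -> int:
    """Tokens left for the summarised history after reserving space
    for the new user input and the tool definitions."""
    used = count_tokens(user_input) + count_tokens(tool_input)
    return max(0, MAX_TOKENS - used - SAFETY_MARGIN)
```

With the numbers from the error above (39 tokens of input, 1113 of tool input), the summary would need to fit in roughly 16385 - 39 - 1113 - margin tokens rather than the full 16385.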