Feature Request: Implement a conversational "Chat Mode" #6082
iFedyna started this conversation in Feature Requests
Issue (Body):
Is your feature request related to a problem? Please describe.
Yes, the current "Agent Mode" has two major problems when used for simple conversations:
Forced Tool-Use Loop: The system is architected to expect tool usage in every interaction. If a user asks a simple question and the model responds directly, without calling a tool, the system treats the response as a failure. It then gets stuck in a loop, repeatedly re-prompting the model to use a tool, which makes natural conversation impossible.
Massive Token Inefficiency: Every message in Agent Mode resends the entire system prompt, which is approximately 9,000 tokens. This is extremely inefficient and costly for simple queries that don't need the full descriptions of all available tools.
Describe the solution you'd like
I propose a new "Chat Mode" designed specifically to solve these issues:
A Valid Conversational Path: This mode will have its own response-handling logic that accepts a direct, text-only reply from the model as a valid terminal state, preventing the "tool-use error" loop.
Optimized System Prompt: Chat Mode will use a much lighter system prompt of around 600 tokens. Compared to the ~9,000-token Agent Mode prompt, that is a ~93% reduction in per-message token usage for conversational queries, which means faster responses and lower API costs.
A UI toggle will allow users to seamlessly switch between the powerful "Agent Mode" for tasks and the lightweight "Chat Mode" for discussion.
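To make the proposal concrete, here is a minimal sketch of the two changes above. All names are illustrative, not the project's actual API: the mode selects which system prompt is sent, and only Agent Mode re-prompts when the model replies without a tool call.

```python
# Illustrative prompts; in practice these would be the real ~9,000-token
# and ~600-token system prompts.
AGENT_SYSTEM_PROMPT = "...full agent prompt with all tool definitions..."
CHAT_SYSTEM_PROMPT = "...lightweight conversational prompt..."

def build_request(mode: str, messages: list) -> dict:
    """Pick the system prompt by mode, avoiding the full prompt for chat."""
    prompt = CHAT_SYSTEM_PROMPT if mode == "chat" else AGENT_SYSTEM_PROMPT
    return {"system": prompt, "messages": messages}

def handle_response(mode: str, response: dict) -> dict:
    """In chat mode a text-only reply is valid; agent mode expects a tool call."""
    if response.get("tool_calls"):
        return {"action": "run_tools", "calls": response["tool_calls"]}
    if mode == "chat":
        # Valid conversational path: no re-prompt loop.
        return {"action": "reply", "text": response.get("text", "")}
    # Agent mode without a tool call: this is the loop the feature avoids.
    return {"action": "reprompt", "reason": "agent mode expects a tool call"}
```

With this split, a text-only answer in Chat Mode ends the turn, while Agent Mode behavior is unchanged.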
Additional context
I have already developed a full-stack implementation of this feature. I am ready to create a Pull Request for review as soon as this proposal is approved.