Best way to have different settings for a single model #1946
Closed
NikolaiLyssogor
started this conversation in
Adapters
Replies: 1 comment
Have you searched for settings in the docs? You can display the hyperparameters at the top of the chat buffer.
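For reference, here is a minimal sketch of enabling that, assuming the `display.chat.show_settings` option described in the docs is available in your CodeCompanion version:

```lua
-- Sketch: show the adapter's settings at the top of each chat buffer,
-- where they can be tweaked for that chat only.
require("codecompanion").setup({
  display = {
    chat = {
      show_settings = true, -- option name assumed from the docs; may vary by version
    },
  },
})
```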
With GPT-5, OpenAI seems to be trying to consolidate towards a single, versatile model for everything. The new model now has a `verbosity` parameter in addition to `reasoning_effort` to control the style and quality of the response.

I would like to be able to quickly change between these settings for different chats (and ideally within a single chat). For instance, sometimes I just have a quick syntax question and only need `reasoning_effort: minimal` and `verbosity: low`. The way I've handled this in the past is by having multiple adapters that extend the OpenAI one, each with different settings (sketched below). But with there now being many combinations of settings that one would reasonably want to change often, this is becoming a bit cumbersome.

One solution to this problem is to use the also-new `gpt-5-chat` model that ChatGPT uses, which selects these parameters for you based on your question. Ideally, though, I would want a bit more control.

I would love to know if there is already a better way to quickly change settings within CodeCompanion. Thanks to anyone who responds.