Copilot's gpt-5 models give a model_not_supported error
#2248
I was trying to set up gpt-5-codex with the Copilot adapter and ran into this error:

```
[chat::_submit_http] Error: {
  body = '{"error":{"message":"The requested model is not supported.","code":"model_not_supported","param":"model","type":"invalid_request_error"}}',
  status = 400
}
```

Now I am unsure what the issue is, because the comment on #2194 suggests to me that this should work now. My config (in case it matters):

```lua
chat = {
  adapter = {
    name = "copilot",
    model = "gpt-5-codex",
  },
  tools = {
    opts = {
      auto_submit_errors = true,
      auto_submit_success = true,
    },
  },
}
```
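For anyone scripting around this failure: the `body` field above is itself a JSON string, so the specific failure can be detected by decoding it and checking `error.code`. A minimal sketch (Python, and the function name is my own, not part of any plugin API):

```python
import json

def is_model_not_supported(body: str) -> bool:
    """True when an API error body carries code == "model_not_supported"."""
    try:
        payload = json.loads(body)
    except json.JSONDecodeError:
        return False
    if not isinstance(payload, dict):
        return False
    error = payload.get("error") or {}
    return error.get("code") == "model_not_supported"

# The exact body from the report above:
body = '{"error":{"message":"The requested model is not supported.","code":"model_not_supported","param":"model","type":"invalid_request_error"}}'
print(is_model_not_supported(body))  # → True
```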
Answered by olimorris (Oct 13, 2025):
Have you enabled it in GitHub?
Answer selected by pseudofractal