openai-native.ts #2846
Fastidius-au started this conversation in Feature Requests
App Version
The current codebase uses the Chat Completions API.
API Provider
OpenAI
Model Used
GPT-4.1 (or any current multimodal model)
Actual vs. Expected Behavior
https://github.com/RooVetGit/Roo-Code/blob/main/src/api/providers/openai-native.ts
From the release email advice:
While they’re available in both the Chat Completions and Responses APIs, for the richest experience, we recommend the Responses API. It supports reasoning summaries—the model’s thoughts stream while you wait for the final response—and enables smarter tool use by preserving the model’s prior reasoning between calls.
https://platform.openai.com/docs/api-reference/responses/object
It seems like this is the new way forward, and the provider needs to handle it better now that GPT-4.1 is likely the go-to coding model for a while. Could we have a toggle for both options in the model select area? I expect Architect mode would use the Responses API and Code mode would use Completions, but feel free to discuss (a rough sketch of what a Responses API call could look like is below).
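For reference, here is a minimal sketch of streaming via the Responses API with the official openai Node SDK. This is illustrative only, not the provider's actual code; the model name and event handling are assumptions:

```typescript
// Hypothetical sketch, not the current openai-native.ts implementation.
import OpenAI from "openai"

const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY })

async function streamWithResponsesApi(prompt: string) {
	// The Responses API takes `input` instead of `messages` and streams
	// typed events (including reasoning summaries on models that emit them).
	const stream = await client.responses.create({
		model: "gpt-4.1", // assumed model id for illustration
		input: [{ role: "user", content: prompt }],
		stream: true,
	})

	for await (const event of stream) {
		// Output text arrives as delta events; other event types cover
		// reasoning summaries, tool calls, and completion.
		if (event.type === "response.output_text.delta") {
			process.stdout.write(event.delta)
		}
	}
}
```

A per-mode toggle would presumably just pick between this path and the existing Chat Completions path when building the request.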
Detailed Steps to Reproduce
Relevant API Request Output
Additional Context
No response