Are reasoning items of OpenAI returned back when using llama-index agent systems? #19058
-
Yes, LlamaIndex agent systems can return reasoning items from OpenAI when using the newer OpenAI Responses API. The integration is handled through the …
-
In this OpenAI guide: https://platform.openai.com/docs/guides/reasoning?api-mode=responses
I found a paragraph about using reasoning models with function calling.
Since LlamaIndex agents predate the Responses API, I'm not sure how the two relate and how this actually works. The paragraph reads as: "if you are going to use a reasoning model with function calling, you should use the stateful Responses API to get the best performance."
Users cannot directly inspect or control this behavior, because OpenAI does not expose raw reasoning items; they are encapsulated inside the Responses API and referenced by their id.
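To make the "encapsulated and referenced by id" point concrete, here is a toy simulation of that stateful pattern. This is not the real OpenAI SDK or the LlamaIndex integration; all names (`create_response`, the in-memory `store`) are illustrative assumptions. It only sketches the idea that a server can keep opaque reasoning items on its side and let the client chain turns via a response id, without ever returning the raw reasoning:

```python
# Toy model of a stateful Responses-style API (illustrative only, not the
# real OpenAI client). Reasoning items live server-side; the client only
# ever receives visible message items plus an id it can chain from.

store = {}  # simulated server-side state: response id -> full item history


def create_response(user_text, previous_response_id=None):
    """Simulate one turn. The opaque reasoning item is stored under the
    returned response id but excluded from the client-visible output."""
    history = store.get(previous_response_id, [])
    reasoning = {"type": "reasoning", "content": "<opaque, server-side only>"}
    message = {"type": "message", "content": f"reply to: {user_text}"}
    response_id = f"resp_{len(store)}"
    store[response_id] = history + [
        {"type": "message", "content": user_text},
        reasoning,
        message,
    ]
    # Only the message is exposed; reasoning stays behind the id.
    return {"id": response_id, "output": [message]}


first = create_response("What is 2 + 2?")
second = create_response("And times 3?", previous_response_id=first["id"])

# The client sees only message items in every turn's output.
assert all(item["type"] == "message" for item in second["output"])
# Yet the server-side history chained via the id does contain reasoning items.
assert any(item["type"] == "reasoning" for item in store[second["id"]])
```

The design point this sketch mirrors: because the client passes only the previous response id rather than the raw items, the reasoning context can be carried across function-calling turns even though the user never sees or controls it.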