TokenCountingHandler doesn't work for OpenAIAgent function calls #19303
-
TokenCountingHandler only tracks tokens for LLM and Embedding events that are emitted through the CallbackManager system. When you use a QueryEngineTool, it calls an LLM and emits these events, so TokenCountingHandler works as expected. FunctionTools, however, usually just wrap plain Python functions, which don't necessarily call an LLM or emit any LLM events, so TokenCountingHandler has nothing to count and stays empty. If your FunctionTool does call an LLM internally (and that LLM uses a properly configured CallbackManager), its token usage will be tracked; otherwise it won't be counted by TokenCountingHandler (see code reference).

If you need to track token usage for FunctionTools that do invoke LLMs, you can implement custom logic to access the raw OpenAI response and extract its token usage, as suggested by the maintainers (see discussion). For pure Python or other non-LLM tools there simply aren't any tokens to count from the LLM's perspective, so TokenCountingHandler won't report any values.
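To make the above concrete, here is a minimal sketch (not taken from the discussion) of wiring a TokenCountingHandler into an OpenAIAgent. It assumes a recent llama-index (0.10+) with the llama-index-llms-openai and llama-index-agent-openai packages installed; the model name, the multiply and summarize tools, and the idea of having one tool call the LLM internally are all illustrative choices, and the exact shape of ChatResponse.raw can vary between versions.

```python
import tiktoken
from llama_index.core.callbacks import CallbackManager, TokenCountingHandler
from llama_index.core.llms import ChatMessage
from llama_index.core.tools import FunctionTool
from llama_index.agent.openai import OpenAIAgent
from llama_index.llms.openai import OpenAI

MODEL = "gpt-3.5-turbo"  # illustrative; use whichever OpenAI model you run

# The handler only sees events routed through a CallbackManager.
token_counter = TokenCountingHandler(
    tokenizer=tiktoken.encoding_for_model(MODEL).encode
)
callback_manager = CallbackManager([token_counter])

# Pass the callback manager explicitly so the agent's own LLM calls
# emit the events the handler listens for.
llm = OpenAI(model=MODEL, callback_manager=callback_manager)


def multiply(a: int, b: int) -> int:
    """Pure-Python tool: running it emits no LLM events, so the tool
    body itself adds nothing to the counter."""
    return a * b


def summarize(text: str) -> str:
    """Tool that calls the LLM internally: because `llm` carries the
    configured callback manager, these calls are counted."""
    return llm.complete(f"Summarize in one sentence: {text}").text


agent = OpenAIAgent.from_tools(
    [FunctionTool.from_defaults(fn=multiply),
     FunctionTool.from_defaults(fn=summarize)],
    llm=llm,
    callback_manager=callback_manager,
    verbose=True,
)

agent.chat("What is 6 times 7?")

print("prompt tokens:    ", token_counter.prompt_llm_token_count)
print("completion tokens:", token_counter.completion_llm_token_count)
print("total LLM tokens: ", token_counter.total_llm_token_count)

# Fallback: OpenAI reports usage on the raw response itself, so it can be
# read from a low-level chat call if the handler stays empty.
chat_response = llm.chat([ChatMessage(role="user", content="Hello")])
print(getattr(chat_response.raw, "usage", None))  # prompt/completion/total tokens
```

The key point is that only calls routed through an LLM carrying the configured CallbackManager show up in the counter; the body of a plain FunctionTool never does.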
-
Hi,
The TokenCountingHandler works fine when I ask a question to an OpenAIAgent that has a QueryEngineTool inside and the agent uses it over indexed documents. But if I ask the same OpenAIAgent a question and it uses only the FunctionTools, the TokenCountingHandler doesn't have any value. How should I use the TokenCountingHandler in this context? Thank you