Replies: 1 comment
See PR #8407
Feature request
Here's the pertinent code, which selectively maps fields from the OpenAI response usage field: langchainjs/libs/langchain-openai/src/llms.ts, line 343 (commit 8e12ef9).
Currently it only maps out three fields:
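Paraphrasing the linked code (a sketch, not the exact implementation), only the three top-level counters are carried over:

```ts
// Sketch of the current mapping in llms.ts: only the three
// top-level counters from `response.usage` are copied.
const { completion_tokens, prompt_tokens, total_tokens } = response.usage ?? {};
const tokenUsage: TokenUsage = {
  completionTokens: completion_tokens,
  promptTokens: prompt_tokens,
  totalTokens: total_tokens,
};
```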
To be able to track other token details, such as cached tokens and tokens used for reasoning by reasoning models, more fields must be mapped. Here's the modern OpenAI usage schema:
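Roughly (field names follow OpenAI's published Chat Completions usage schema; the detail blocks are optional and their availability varies by model):

```ts
// Approximate shape of the OpenAI `usage` field on chat completions.
interface OpenAICompletionUsage {
  prompt_tokens: number;
  completion_tokens: number;
  total_tokens: number;
  prompt_tokens_details?: {
    cached_tokens?: number;
    audio_tokens?: number;
  };
  completion_tokens_details?: {
    reasoning_tokens?: number;
    audio_tokens?: number;
    accepted_prediction_tokens?: number;
    rejected_prediction_tokens?: number;
  };
}
```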
Motivation
Cached tokens and tokens used for reasoning by reasoning models are important cost factors for cost tracking and optimization. Without access to these, there is a significant disadvantage to using langchainjs.
Proposal (If applicable)
Expand the TokenUsage interface and map the additional fields. I'm planning on forking the code to do this, and I'd be happy to create a PR if desired. I intend to map it to this structure:
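Something along these lines (the camelCase field names below are illustrative, mirroring the OpenAI detail blocks, and open to feedback):

```ts
// Illustrative expanded TokenUsage; these names are a proposed mapping,
// not an existing langchainjs interface.
interface TokenUsage {
  completionTokens?: number;
  promptTokens?: number;
  totalTokens?: number;
  promptTokensDetails?: {
    cachedTokens?: number;
    audioTokens?: number;
  };
  completionTokensDetails?: {
    reasoningTokens?: number;
    audioTokens?: number;
    acceptedPredictionTokens?: number;
    rejectedPredictionTokens?: number;
  };
}
```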
Feedback welcome. Thank you.