[langchain] token counts missing for gemini + langchain #1391
Labels: bug · help wanted · language: python
see https://arize-ai.slack.com/archives/C04R3GXC8HK/p1742373744925739
We might not be looking in the right place for token counts.