
Last stream chunk does not return token usage info with create_chat_completion_openai_v1 or create_chat_completion, but the server does #1984


Open
4 tasks done
hh23485 opened this issue Mar 27, 2025 · 1 comment

Comments

@hh23485

hh23485 commented Mar 27, 2025

Prerequisites

The last stream chunk does not return token usage info with create_chat_completion_openai_v1 or create_chat_completion, although the server does.

The Python API suits me better, but no token usage is returned. Is this by design?

  • I am running the latest code. Development is very rapid so there are no tagged versions as of now.
  • I carefully followed the README.md.
  • I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
  • I reviewed the Discussions, and have a new bug or useful enhancement to share.

Expected Behavior

Called with

response = llm.create_chat_completion_openai_v1(
    messages=[
        {
            "role": "system",
            # Translated from Chinese: "Please generate the scene tags and
            # function tags for this instruction, and reply in JSON list format."
            "content": "Please generate the scene tags and function tags for this instruction, and reply in JSON list format.",
        },
        {"role": "user", "content": """<some content>"""},
    ],
    temperature=0,
    max_tokens=192,
    stream=True,
)

Current Behavior

[Image attachment: screenshot of the observed streamed response]

Failure Information (for bugs)

Setting stream_options={"include_usage": True} does not seem to work. Is there any other way?
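For context, in the OpenAI streaming API that create_chat_completion_openai_v1 is modeled on, stream_options={"include_usage": True} makes the server append one extra final chunk whose choices list is empty and whose usage field is populated. A minimal sketch of the pattern the reporter expects, using mock chunk dicts (the chunk data and helper name are illustrative, not from llama-cpp-python):

```python
# Mock chunks in the OpenAI streaming shape (illustrative data only).
chunks = [
    {"choices": [{"delta": {"content": "Hello"}}], "usage": None},
    {"choices": [{"delta": {"content": " world"}}], "usage": None},
    # With include_usage, the real API appends a final chunk with
    # empty choices and a populated usage object.
    {"choices": [], "usage": {"prompt_tokens": 12, "completion_tokens": 2, "total_tokens": 14}},
]

def collect_text_and_usage(stream):
    """Accumulate streamed text and grab usage from whichever chunk carries it."""
    text, usage = [], None
    for chunk in stream:
        for choice in chunk["choices"]:
            text.append(choice["delta"].get("content", ""))
        if chunk.get("usage"):
            usage = chunk["usage"]
    return "".join(text), usage

text, usage = collect_text_and_usage(chunks)
print(text)   # → Hello world
print(usage)  # → {'prompt_tokens': 12, 'completion_tokens': 2, 'total_tokens': 14}
```

The bug report is that llama-cpp-python's streaming path never emits such a usage-bearing chunk, so `usage` stays None.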

Steps to Reproduce

Please provide detailed steps for reproducing the issue. We are not sitting in front of your screen, so the more detail the better.

  1. Load any model.
  2. Call create_chat_completion_openai_v1 or create_chat_completion with stream mode enabled.
  3. Check the chunk responses.
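The check in step 3 can be sketched as a small helper; the mock chunks below stand in for real model output (with the reported bug, no chunk carries usage):

```python
def stream_reports_usage(chunks):
    """Return True if any streamed chunk carries a non-empty usage dict."""
    return any(chunk.get("usage") for chunk in chunks)

# Chunks as the streaming path reportedly produces them (illustrative):
buggy_chunks = [
    {"choices": [{"delta": {"content": "tag"}}], "usage": None},
    {"choices": [{"delta": {}, "finish_reason": "stop"}], "usage": None},
]
print(stream_reports_usage(buggy_chunks))  # → False
```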
@hh23485
Author

hh23485 commented Mar 27, 2025

I think #1552 is the one I needed.
