
Commit 836d4ce (parent: c3fec47)

[Bugfix] fix missing 'finish_reason': null in streaming chat (#19662)

Signed-off-by: chaunceyjiang <chaunceyjiang@gmail.com>

1 file changed, 1 addition, 1 deletion:

vllm/entrypoints/openai/serving_chat.py
@@ -873,7 +873,7 @@ async def chat_completion_stream_generator(
                    total_tokens=num_prompt_tokens + completion_tokens,
                )

-            data = chunk.model_dump_json(exclude_none=True)
+            data = chunk.model_dump_json(exclude_unset=True)
            yield f"data: {data}\n\n"

            # once the final token is handled, if stream_options.include_usage
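The one-word change hinges on how pydantic serializes `None` fields. With `exclude_none=True`, any field whose value is `None` is dropped from the JSON, so intermediate stream chunks lose their explicitly set `"finish_reason": null`. With `exclude_unset=True`, a field is kept as long as it was explicitly assigned, even if its value is `None`. A minimal sketch of the difference, using a hypothetical `Choice` model standing in for vLLM's streaming response objects (which are pydantic models, though the exact class here is illustrative):

```python
from typing import Optional
from pydantic import BaseModel


class Choice(BaseModel):
    """Illustrative stand-in for a streaming chat completion choice."""
    index: int = 0
    finish_reason: Optional[str] = None


# Intermediate chunks explicitly set finish_reason to None.
chunk = Choice(index=0, finish_reason=None)

# exclude_none drops the explicitly assigned None field entirely.
print(chunk.model_dump_json(exclude_none=True))
# {"index":0}

# exclude_unset keeps it, because the field WAS set (just to None).
print(chunk.model_dump_json(exclude_unset=True))
# {"index":0,"finish_reason":null}
```

Clients that follow the OpenAI streaming format expect every choice to carry a `finish_reason` key (null until the final chunk), which is why `exclude_unset` is the correct option here.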

0 commit comments
