Commit 5394ad7

[Bugfix] fix KeyError when top logprobs are special tokens (vllm-project#17637)
Signed-off-by: chaunceyjiang <chaunceyjiang@gmail.com>
1 parent 68e1ee0 commit 5394ad7

File tree

1 file changed: +2 −1 lines changed


vllm/entrypoints/openai/serving_chat.py

Lines changed: 2 additions & 1 deletion
```diff
@@ -1111,7 +1111,8 @@ def _create_chat_logprobs(
             return_as_token_id is not None else self.return_tokens_as_token_ids
         for i, token_id in enumerate(token_ids):
             step_top_logprobs = top_logprobs[i]
-            if step_top_logprobs is None:
+            if step_top_logprobs is None or step_top_logprobs.get(
+                    token_id) is None:
                 token = tokenizer.decode(token_id)
                 if should_return_as_token_id:
                     token = f"token_id:{token_id}"
```
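The failure mode this commit fixes can be sketched in isolation: the sampled `token_id` may be absent from the step's top-logprobs mapping (e.g. when it is a special token), so indexing the dict raises `KeyError`, whereas the new `.get(token_id) is None` check routes such tokens through the same fallback path used when the whole step has no logprobs. The dict contents and token id below are hypothetical illustration data, not values from vLLM.

```python
# Hypothetical top-logprobs map for one decoding step: token_id -> logprob.
step_top_logprobs = {101: -0.1, 102: -2.3}
token_id = 2  # hypothetical special token not among the top logprobs

# Before the fix: only `is None` was checked, so a later
# `step_top_logprobs[token_id]` lookup raised KeyError.
try:
    _ = step_top_logprobs[token_id]
    raised = False
except KeyError:
    raised = True

# After the fix: a missing key is treated like a missing step,
# triggering the decode/`token_id:` fallback instead of crashing.
needs_fallback = (step_top_logprobs is None
                  or step_top_logprobs.get(token_id) is None)

print(raised, needs_fallback)  # True True
```

Using `dict.get`, which returns `None` for absent keys instead of raising, keeps the one existing fallback branch (`tokenizer.decode` or the `token_id:` prefix form) covering both cases with a two-line change.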
