
Commit b2d9be6

[Docs] Remove WIP features in V1 guide (#19498)
Signed-off-by: Woosuk Kwon <woosuk.kwon@berkeley.edu>
1 parent 04a5561 commit b2d9be6

File tree

1 file changed: +1 −9 lines


docs/usage/v1_guide.md

Lines changed: 1 addition & 9 deletions
@@ -103,7 +103,7 @@ For a complete list of supported models, see the [list of supported models](http
 | **LoRA** | <nobr>🚀 Optimized</nobr> |
 | **Logprobs Calculation** | <nobr>🟢 Functional</nobr> |
 | **FP8 KV Cache** | <nobr>🟢 Functional on Hopper devices ([PR #15191](https://github.com/vllm-project/vllm/pull/15191))</nobr>|
-| **Spec Decode** | <nobr>🚧 WIP ([PR #13933](https://github.com/vllm-project/vllm/pull/13933))</nobr>|
+| **Spec Decode** | <nobr>🚀 Optimized</nobr> |
 | **Prompt Logprobs with Prefix Caching** | <nobr>🟡 Planned ([RFC #13414](https://github.com/vllm-project/vllm/issues/13414))</nobr>|
 | **Structured Output Alternative Backends** | <nobr>🟢 Functional</nobr> |
 | **Request-level Structured Output Backend** | <nobr>🔴 Deprecated</nobr> |
@@ -137,14 +137,6 @@ Support for logprobs with post-sampling adjustments is in progress and will be a
 
 Currently prompt logprobs are only supported when prefix caching is turned off via `--no-enable-prefix-caching`. In a future release, prompt logprobs will be compatible with prefix caching, but a recomputation will be triggered to recover the full prompt logprobs even upon a prefix cache hit. See details in [RFC #13414](https://github.com/vllm-project/vllm/issues/13414).
 
-#### WIP Features
-
-These features are already supported in vLLM V1, but their optimization is still
-in progress.
-
-- **Spec Decode**: Currently, only ngram-based spec decode is supported in V1. There
-  will be follow-up work to support other types of spec decode (e.g., see [PR #13933](https://github.com/vllm-project/vllm/pull/13933)). We will prioritize the support for Eagle, MTP compared to draft model based spec decode.
-
 #### Deprecated Features
 
 As part of the major architectural rework in vLLM V1, several legacy features have been deprecated.
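The unchanged context line in the second hunk notes that prompt logprobs currently require prefix caching to be turned off via `--no-enable-prefix-caching`. A minimal launch sketch of that workaround follows; the model name is an illustrative placeholder, not something from this commit:

```shell
# Start a vLLM server with prefix caching disabled so that prompt logprobs
# are available (per the guide text above). Model name is a placeholder.
vllm serve meta-llama/Llama-3.1-8B-Instruct --no-enable-prefix-caching
```

With the server running, clients can then ask for per-token prompt logprobs in their requests; once [RFC #13414](https://github.com/vllm-project/vllm/issues/13414) lands, the flag should no longer be necessary.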

0 commit comments