
Commit 571f88f

authored by wangxiyuan

[Doc] Update 0.9.0rc1 release date (#1139)

1. Update 0.9.0rc1 release date
2. Update feature and model support list
3. Add DP known issue to release note

Signed-off-by: wangxiyuan <wangxiyuan1007@gmail.com>

1 parent cd2f14a commit 571f88f

File tree

4 files changed: +8 -7 lines changed


docs/source/developer_guide/versioning_policy.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -34,7 +34,7 @@ Following is the Release Compatibility Matrix for vLLM Ascend Plugin:
 
 | Date | Event |
 |------------|-------------------------------------------|
-| 2025.06.07 | Release candidates, v0.9.0rc1 |
+| 2025.06.09 | Release candidates, v0.9.0rc1 |
 | 2025.05.29 | v0.7.x post release, v0.7.3.post1 |
 | 2025.05.08 | v0.7.x Final release, v0.7.3 |
 | 2025.05.06 | Release candidates, v0.8.5rc1 |
```

docs/source/user_guide/release_notes.md

Lines changed: 2 additions & 1 deletion

```diff
@@ -1,6 +1,6 @@
 # Release note
 
-## v0.9.0rc1 - 2025.06.07
+## v0.9.0rc1 - 2025.06.09
 
 This is the 1st release candidate of v0.9.0 for vllm-ascend. Please follow the [official doc](https://vllm-ascend.readthedocs.io/en/) to start the journey. From this release, V1 Engine is recommended to use. The code of V0 Engine is frozen and will not be maintained any more. Please set environment `VLLM_USE_V1=1` to enable V1 Engine.
 
@@ -36,6 +36,7 @@ This is the 1st release candidate of v0.9.0 for vllm-ascend. Please follow the [
 ### Known Issue
 
 - In some case, vLLM process may be crashed with aclgraph enabled. We're working this issue and it'll be fixed in the next release.
+- Multi node data-parallel doesn't work with this release. This is a known issue in vllm and has been fixed on main branch. [#18981](https://github.com/vllm-project/vllm/pull/18981)
 
 ## v0.7.3.post1 - 2025.05.29
 
```
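The release note above says the V0 Engine is frozen from this release and that V1 must be opted into via an environment variable. A minimal sketch of how that looks in practice (the `vllm serve` launch line and model name are illustrative, not part of the commit):

```shell
# The V0 engine code is frozen as of v0.9.0rc1; enable the V1 engine
# before launching vLLM (variable name taken from the release note):
export VLLM_USE_V1=1

# Example launch (commented out; model name is illustrative):
# vllm serve Qwen/Qwen2.5-7B-Instruct

echo "VLLM_USE_V1=$VLLM_USE_V1"
```

Setting the variable in the shell that launches the server is enough; no code change is required to switch engines.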

docs/source/user_guide/supported_models.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -18,13 +18,13 @@
 | Phi-4-mini || |
 | MiniCPM || |
 | MiniCPM3 || |
+| LLama4 || |
 | Mistral | | Need test |
 | DeepSeek v2.5 | |Need test |
 | Gemma-2 | | Need test |
 | Mllama | |Need test|
 | Gemma-3 || [#496](https://github.com/vllm-project/vllm-ascend/issues/496) |
 | ChatGLM || [#554](https://github.com/vllm-project/vllm-ascend/issues/554) |
-| LLama4 || [#471](https://github.com/vllm-project/vllm-ascend/issues/471) |
 
 ### Pooling Models
 | Model | Supported | Note |
```

docs/source/user_guide/suppoted_features.md

Lines changed: 4 additions & 4 deletions

```diff
@@ -6,11 +6,11 @@ You can check the [support status of vLLM V1 Engine][v1_user_guide]. Below is th
 
 | Feature | vLLM V0 Engine | vLLM V1 Engine | Next Step |
 |-------------------------------|----------------|----------------|------------------------------------------------------------------------|
-| Chunked Prefill | 🚧 WIP | 🟢 Functional | Functional, see detail note: [Chunked Prefill][cp] |
-| Automatic Prefix Caching | 🚧 WIP | 🟢 Functional | Functional, see detail note: [vllm-ascend#732][apc] |
+| Chunked Prefill | 🟢 Functional | 🟢 Functional | Functional, see detail note: [Chunked Prefill][cp] |
+| Automatic Prefix Caching | 🟢 Functional | 🟢 Functional | Functional, see detail note: [vllm-ascend#732][apc] |
 | LoRA | 🟢 Functional | 🟢 Functional | [vllm-ascend#396][multilora], [vllm-ascend#893][v1 multilora] |
-| Prompt adapter | 🔴 No plan | 🟡 Planned | Plan in 2025.06.30 |
-| Speculative decoding | 🟢 Functional | 🚧 WIP | CI needed; working on V1 support |
+| Prompt adapter | 🔴 No plan | 🔴 No plan | This feature has been deprecated by vllm. |
+| Speculative decoding | 🟢 Functional | 🟢 Functional | Basic support |
 | Pooling | 🟢 Functional | 🟡 Planned | CI needed and adapting more models; V1 support rely on vLLM support. |
 | Enc-dec | 🔴 NO plan | 🟡 Planned | Plan in 2025.06.30 |
 | Multi Modality | 🟢 Functional | 🟢 Functional | [Tutorial][multimodal], optimizing and adapting more models |
```
