1 parent 1ce41a7 commit c03e12e
docs/source/user_guide/graph_mode.md
@@ -47,14 +47,14 @@ from vllm import LLM
os.environ["VLLM_USE_V1"] = "1"

-model = LLM(model="deepseek-ai/DeepSeek-R1-0528", additional_config={"torchair_graph_config": {"enable": True}})
+model = LLM(model="deepseek-ai/DeepSeek-R1-0528", additional_config={"torchair_graph_config": {"enabled": True}})
outputs = model.generate("Hello, how are you?")
```

online example:

```shell
-vllm serve Qwen/Qwen2-7B-Instruct --additional-config='{"torchair_graph_config": {"enable": true}}'
+vllm serve Qwen/Qwen2-7B-Instruct --additional-config='{"torchair_graph_config": {"enabled": true}}'
```

You can find more detail about additional config [here](./additional_config.md)
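
For reference, here is a minimal, self-contained sketch of the corrected offline usage, assuming vLLM's standard `LLM`/`SamplingParams` API and the `additional_config` pass-through used by vllm-ascend; the sampling parameters and the output loop are illustrative additions and are not part of the patched document.

```python
# Sketch only: shows the corrected "enabled" key from this commit in context.
import os

from vllm import LLM, SamplingParams

# VLLM_USE_V1 must be a string value; set it before the engine is created.
os.environ["VLLM_USE_V1"] = "1"

if __name__ == "__main__":
    # Enable TorchAir graph mode via additional_config; the switch is
    # "enabled" (not "enable"), as corrected above.
    llm = LLM(
        model="deepseek-ai/DeepSeek-R1-0528",
        additional_config={"torchair_graph_config": {"enabled": True}},
    )

    # Illustrative sampling settings (assumed, not from the patched doc).
    sampling_params = SamplingParams(temperature=0.0, max_tokens=64)

    outputs = llm.generate(["Hello, how are you?"], sampling_params)
    for output in outputs:
        print(output.outputs[0].text)
```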