Commit e511ddd

[Bug] Fix wrong modelscope env set order (#1611)
### What this PR does / why we need it?

The `os.environ["VLLM_USE_MODELSCOPE"] = "True"` assignment must be placed before the module imports; otherwise the flag is not picked up and model loading fails:

```
The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/xleoken/projects/vllm-ascend/examples/offline_embed.py", line 48, in <module>
    model = LLM(model="Qwen/Qwen3-Embedding-0.6B", task="embed")
  File "/usr/local/python3.10.17/lib/python3.10/site-packages/vllm/entrypoints/llm.py", line 243, in __init__
    self.llm_engine = LLMEngine.from_engine_args(
  File "/usr/local/python3.10.17/lib/python3.10/site-packages/vllm/engine/llm_engine.py", line 494, in from_engine_args
    vllm_config = engine_args.create_engine_config(usage_context)
  File "/usr/local/python3.10.17/lib/python3.10/site-packages/vllm/engine/arg_utils.py", line 1018, in create_engine_config
    model_config = self.create_model_config()
  File "/usr/local/python3.10.17/lib/python3.10/site-packages/vllm/engine/arg_utils.py", line 910, in create_model_config
    return ModelConfig(
  File "/usr/local/python3.10.17/lib/python3.10/site-packages/pydantic/_internal/_dataclasses.py", line 120, in __init__
    s.__pydantic_validator__.validate_python(ArgsKwargs(args, kwargs), self_instance=s)
  File "/usr/local/python3.10.17/lib/python3.10/site-packages/vllm/config.py", line 528, in __post_init__
    hf_config = get_config(self.hf_config_path or self.model,
  File "/usr/local/python3.10.17/lib/python3.10/site-packages/vllm/transformers_utils/config.py", line 321, in get_config
    config_dict, _ = PretrainedConfig.get_config_dict(
  File "/usr/local/python3.10.17/lib/python3.10/site-packages/transformers/configuration_utils.py", line 590, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/usr/local/python3.10.17/lib/python3.10/site-packages/transformers/configuration_utils.py", line 649, in _get_config_dict
    resolved_config_file = cached_file(
  File "/usr/local/python3.10.17/lib/python3.10/site-packages/transformers/utils/hub.py", line 266, in cached_file
    file = cached_files(path_or_repo_id=path_or_repo_id, filenames=[filename], **kwargs)
  File "/usr/local/python3.10.17/lib/python3.10/site-packages/transformers/utils/hub.py", line 491, in cached_files
    raise OSError(
OSError: We couldn't connect to 'https://huggingface.co' to load the files, and couldn't find them in the cached files. Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
[ERROR] 2025-07-03-15:27:10 (PID:333665, Device:-1, RankID:-1) ERR99999 UNKNOWN applicaiton exception
```

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Tested locally.

Signed-off-by: xleoken <xleoken@163.com>
1 parent a45dfde commit e511ddd

File tree

1 file changed (+1, −3 lines)


examples/offline_embed.py

Lines changed: 1 addition & 3 deletions

```diff
@@ -18,13 +18,11 @@
 #
 
 import os
+os.environ["VLLM_USE_MODELSCOPE"] = "True"
 
 import torch
 from vllm import LLM
 
-os.environ["VLLM_USE_MODELSCOPE"] = "True"
-
-
 def get_detailed_instruct(task_description: str, query: str) -> str:
     return f'Instruct: {task_description}\nQuery:{query}'
 
```
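The ordering bug the commit fixes can be illustrated with a small, self-contained sketch. The `fake_envs` module below is hypothetical, standing in for a library that reads an environment flag once at import time (as the traceback above suggests happens with `VLLM_USE_MODELSCOPE`): if the variable is set only after the import, the module has already captured the default.

```python
import os
import sys
import tempfile
import textwrap

# Hypothetical module that snapshots the env flag at import time,
# mimicking the import-time behavior described in the commit message.
module_src = textwrap.dedent("""
    import os
    # Captured exactly once, when this module is first imported.
    USE_MODELSCOPE = os.environ.get("VLLM_USE_MODELSCOPE", "False")
""")

with tempfile.TemporaryDirectory() as tmp:
    with open(os.path.join(tmp, "fake_envs.py"), "w") as f:
        f.write(module_src)
    sys.path.insert(0, tmp)

    # Set the flag BEFORE the import, as the patched example now does.
    os.environ["VLLM_USE_MODELSCOPE"] = "True"
    import fake_envs

    # Changing the variable afterwards has no effect on the cached value,
    # which is why the pre-patch ordering silently fell through to the
    # Hugging Face hub.
    print(fake_envs.USE_MODELSCOPE)  # -> True
```

With the assignment moved after the import, `fake_envs.USE_MODELSCOPE` would remain `"False"` for the life of the process, mirroring the failed ModelScope opt-in in the original example script.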