### Your current environment
vLLM 0.9.1 (Docker)
Model: mistralai/Mistral-Small-3.2-24B-Instruct-2506

Running mistralai/Mistral-Small-3.1-24B-Instruct-2503 with everything else unchanged works.
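The exact launch command is not included above; for context, a typical Docker invocation for serving this model (image tag and Mistral-format flags assumed from vLLM's documented Mistral setup, not taken from this report) looks like:

```shell
# Hypothetical launch command. The vllm/vllm-openai:v0.9.1 tag and the
# mistral tokenizer/config/load flags are assumptions based on vLLM's
# usual Mistral serving setup, not copied from this report.
docker run --gpus all -p 8000:8000 \
    vllm/vllm-openai:v0.9.1 \
    --model mistralai/Mistral-Small-3.2-24B-Instruct-2506 \
    --tokenizer-mode mistral \
    --config-format mistral \
    --load-format mistral
```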
### 🐛 Describe the bug
```
ERROR 06-24 14:40:09 [serving_chat.py:200] Error in preprocessing prompt inputs
Traceback (most recent call last):
  File "/workspace/.venv/lib/python3.12/site-packages/vllm/entrypoints/openai/serving_chat.py", line 183, in create_chat_completion
    ) = await self._preprocess_chat(
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/workspace/.venv/lib/python3.12/site-packages/vllm/entrypoints/openai/serving_engine.py", line 787, in _preprocess_chat
    conversation, mm_data_future = parse_chat_messages_futures(
                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/workspace/.venv/lib/python3.12/site-packages/vllm/entrypoints/chat_utils.py", line 1196, in parse_chat_messages_futures
    sub_messages = _parse_chat_message_content(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/workspace/.venv/lib/python3.12/site-packages/vllm/entrypoints/chat_utils.py", line 1120, in _parse_chat_message_content
    result = _parse_chat_message_content_parts(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/workspace/.venv/lib/python3.12/site-packages/vllm/entrypoints/chat_utils.py", line 1020, in _parse_chat_message_content_parts
    parse_res = _parse_chat_message_content_part(
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/workspace/.venv/lib/python3.12/site-packages/vllm/entrypoints/chat_utils.py", line 1077, in _parse_chat_message_content_part
    mm_parser.parse_image(str_content)
  File "/workspace/.venv/lib/python3.12/site-packages/vllm/entrypoints/chat_utils.py", line 777, in parse_image
    placeholder = self._tracker.add("image", image_coro)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/workspace/.venv/lib/python3.12/site-packages/vllm/entrypoints/chat_utils.py", line 586, in add
    mm_processor = mm_registry.create_processor(model_config)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/workspace/.venv/lib/python3.12/site-packages/vllm/multimodal/registry.py", line 275, in create_processor
    model_cls = self._get_model_cls(model_config)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/workspace/.venv/lib/python3.12/site-packages/vllm/multimodal/registry.py", line 246, in _get_model_cls
    model_cls, _ = get_model_architecture(model_config)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/workspace/.venv/lib/python3.12/site-packages/vllm/model_executor/model_loader/utils.py", line 240, in get_model_architecture
    model_cls, arch = ModelRegistry.resolve_model_cls(architectures)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/workspace/.venv/lib/python3.12/site-packages/vllm/model_executor/models/registry.py", line 489, in resolve_model_cls
    return self._raise_for_unsupported(architectures)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/workspace/.venv/lib/python3.12/site-packages/vllm/model_executor/models/registry.py", line 426, in _raise_for_unsupported
    raise ValueError(
ValueError: Model architectures ['PixtralForConditionalGeneration'] failed to be inspected. Please check the logs for more details.

INFO:     10.231.0.4:36046 - "POST /v1/chat/completions HTTP/1.1" 400 Bad Request
```
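The failing path runs through `mm_parser.parse_image`, so the triggering request must contain an image content part. A minimal request body of this shape exercises it (the prompt text and image URL below are placeholders, not taken from this report):

```python
# Hypothetical minimal /v1/chat/completions request body that reaches
# mm_parser.parse_image() during prompt preprocessing. The model name is
# from this report; the text and image URL are placeholders.
import json

payload = {
    "model": "mistralai/Mistral-Small-3.2-24B-Instruct-2506",
    "messages": [
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What is in this image?"},
                {
                    "type": "image_url",
                    "image_url": {"url": "https://example.com/image.png"},
                },
            ],
        }
    ],
}

print(json.dumps(payload, indent=2))
```

As reported above, a request like this returns `400 Bad Request` with 3.2-2506, while the same request against 3.1-2503 succeeds.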
### Before submitting a new issue...
- Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.