
Import Error!!! #232

@Islander-0v0-wxin

Description


INFO 11-07 06:43:30 [importing.py:53] Triton module has been replaced with a placeholder.
INFO 11-07 06:43:31 [__init__.py:239] Automatically detected platform cuda.

ModuleNotFoundError Traceback (most recent call last)
/tmp/ipython-input-720217358.py in <cell line: 0>()
1 from vllm import LLM, SamplingParams
----> 2 from vllm.model_executor.models.deepseek_ocr import NGramPerReqLogitsProcessor
3 from PIL import Image
4
5 # Create model instance

ModuleNotFoundError: No module named 'vllm.model_executor.models.deepseek_ocr'

I tried many different installation setups, including:
pip install torch==2.8.0 torchvision==0.23.0 torchaudio==2.8.0 --index-url https://download.pytorch.org/whl/cu126
pip install vllm==0.8.5
pip install -r requirements.txt
pip install flash-attn==2.7.3 --no-build-isolation

And

pip install torch==2.6.0 torchvision==0.21.0 torchaudio==2.6.0 --index-url https://download.pytorch.org/whl/cu118
pip install vllm-0.8.5+cu118-cp38-abi3-manylinux1_x86_64.whl
pip install -r requirements.txt
pip install flash-attn==2.7.3 --no-build-isolation
as well as:

uv pip install -U vllm --pre --extra-index-url https://wheels.vllm.ai/nightly

But none of them worked; every attempt ended with the same error:
ModuleNotFoundError: No module named 'vllm.model_executor.models.deepseek_ocr'
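A quick way to confirm whether the installed vLLM build actually ships this module (a diagnostic sketch only; `module_available` is a hypothetical helper name, not part of vLLM):

```python
import importlib.util


def module_available(name: str) -> bool:
    """Return True if `name` can be located without fully importing it."""
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        # Raised when a parent package (e.g. vllm itself) is missing.
        return False


# Older vLLM releases may simply not include this model module at all,
# in which case no environment tweak will make the import succeed.
print(module_available("vllm.model_executor.models.deepseek_ocr"))
```

If this prints False for a given build, the error comes from the installed vLLM version lacking the module, not from the surrounding environment.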
