
Support for exporting PaliGemma to ONNX #2329

@DashaMed555

Description


Feature request

I’ve tried to export google/paligemma-3b-mix-224 to ONNX using Optimum, but the export fails with:

"ValueError: Trying to export a paligemma model, that is a custom or unsupported architecture, but no custom onnx configuration was passed as custom_onnx_configs. Please refer to https://huggingface.co/docs/optimum/main/en/exporters/onnx/usage_guides/export_a_model#custom-export-of-transformers-models for an example on how to export custom models. Please open an issue at https://github.com/huggingface/optimum/issues if you would like the model type paligemma to be supported natively in the ONNX export."
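
For reference, the error above is what I get from a plain export command along these lines (the output directory name is just a placeholder):

```
optimum-cli export onnx --model google/paligemma-3b-mix-224 paligemma_onnx/
```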

Motivation

I’ve tried everything I could think of, but nothing works =(
(passing custom ONNX configurations via custom_onnx_configs, calling torch.onnx.export directly, etc.) A rough sketch of the torch.onnx.export attempt is below.
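
For completeness, here is a minimal sketch of what the direct torch.onnx.export attempt looked like. It assumes the standard transformers PaliGemma classes and uses made-up dummy inputs; it is only meant to illustrate the approach I tried, not a working export.

```python
import torch
from PIL import Image
from transformers import AutoProcessor, PaliGemmaForConditionalGeneration

model_id = "google/paligemma-3b-mix-224"
processor = AutoProcessor.from_pretrained(model_id)
model = PaliGemmaForConditionalGeneration.from_pretrained(model_id, torch_dtype=torch.float32)
model.eval()

# Dummy inputs built with the processor: a blank 224x224 image and a short prompt.
image = Image.new("RGB", (224, 224))
inputs = processor(text="caption en", images=image, return_tensors="pt")

# Direct export attempt; this call did not give me a usable ONNX graph.
torch.onnx.export(
    model,
    (inputs["input_ids"], inputs["pixel_values"], inputs["attention_mask"]),
    "paligemma.onnx",
    input_names=["input_ids", "pixel_values", "attention_mask"],
    output_names=["logits"],
    dynamic_axes={
        "input_ids": {0: "batch", 1: "sequence"},
        "attention_mask": {0: "batch", 1: "sequence"},
        "logits": {0: "batch", 1: "sequence"},
    },
    opset_version=17,
)
```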

Your contribution

Unfortunately, I don’t think I’m able to implement this myself… =(
