BertLMHeadModel has no attribute 'generate' #218

@allisonandreyev

Description

Tag2Text inference crashes with `AttributeError: 'BertLMHeadModel' object has no attribute 'generate'`.

**Repro:** `results = inference(image_tensor, model, 'None')`, which reaches `tag2text.py:355` and calls `self.text_decoder.generate(...)`.

**Environment:** macOS, Python 3.13, Transformers 4.x. In this environment, `BertLMHeadModel` does not expose `generate`.

Is there a required Transformers version, or should the decoder be a generation-capable class such as `AutoModelForCausalLM` or a seq2seq model? If a version pin is intended, please specify it. I can test a small patch that guards this call with `hasattr(self.text_decoder, "generate")`.
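For reference, here is a minimal sketch of the guard I have in mind. `safe_generate` and `DummyDecoder` are hypothetical names for illustration; the dummy class stands in for a decoder that lacks `.generate()`, so this does not depend on any particular Transformers version:

```python
class DummyDecoder:
    """Stand-in for a text decoder that does not expose .generate()."""
    pass

def safe_generate(text_decoder, **generate_kwargs):
    # Guard the call so a missing generate() surfaces as a clear error
    # at the call site instead of an opaque AttributeError deep in
    # tag2text.py.
    if not hasattr(text_decoder, "generate"):
        raise RuntimeError(
            "text_decoder has no generate(); check the installed "
            "transformers version or use a generation-capable decoder."
        )
    return text_decoder.generate(**generate_kwargs)

try:
    safe_generate(DummyDecoder())
except RuntimeError as err:
    print(err)
```

The guard only improves the error message; the underlying fix is still either pinning a Transformers version whose `BertLMHeadModel` inherits generation support, or switching to a decoder class that does.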
