Tag2Text inference crashes with `AttributeError: 'BertLMHeadModel' object has no attribute 'generate'`

Repro: `results = inference(image_tensor, model, 'None')`, which reaches `tag2text.py:355` and calls `self.text_decoder.generate(...)`. In my environment (macOS, Python 3.13, Transformers 4.x), `BertLMHeadModel` does not expose `generate`. Is there a required Transformers version, or should the decoder be a generation-capable class such as `AutoModelForCausalLM` or a seq2seq model? If a version pin is intended, please document it. I can test a small patch that guards this call with `hasattr(self.text_decoder, "generate")`.
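As a quick diagnostic for the version question, here is a minimal sketch of the check I plan to run. It instantiates the stock `transformers.BertLMHeadModel` with a tiny random config rather than Tag2Text's own decoder class, so treat it as an assumption about where the method disappears, not an exact repro:

```python
# Environment check: does the installed Transformers give a BERT decoder head
# a generate() method? Uses a tiny, randomly initialized config so it runs fast.
# Note: Tag2Text builds text_decoder from its own BertLMHeadModel variant,
# which may behave differently from the stock class used here.
import transformers
from transformers import BertConfig, BertLMHeadModel

print("transformers version:", transformers.__version__)

config = BertConfig(
    is_decoder=True,        # decoder heads need is_decoder=True
    hidden_size=32,
    num_hidden_layers=1,
    num_attention_heads=2,
    intermediate_size=64,
)
decoder = BertLMHeadModel(config)
print("has generate():", hasattr(decoder, "generate"))
```

If this prints `True` while the Tag2Text decoder still lacks `generate`, the problem is more likely in how the repo's custom decoder class is defined than in the Transformers version itself.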