Question: Where is the setting to freeze the backbone LLM like LLaVA? #47

@Kurt232

Description

    # LLaVA
    if model_args.freeze_backbone:
        model.model.requires_grad_(False)

In the LTU code, I can only find a comment saying the LLM is already frozen, but not the code that actually freezes it:

    # for audio params, lora always trainable, llama always frozen
    for name, param in model.named_parameters():
        if trainable_params == 'all':
            if "audio" in name:
                param.requires_grad = True
                #print(f"Parameter: {name}, requires_grad: {param.requires_grad}")
        if trainable_params == 'proj':
            if "audio_proj" in name:
                param.requires_grad = True
                #print(f"Parameter: {name}, requires_grad: {param.requires_grad}")
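For reference, one common way to get the behavior both snippets imply is to freeze the whole model first and then re-enable only the audio-side parameters. The sketch below is a minimal toy example (not the LTU model; `ToyModel`, `llama_layer`, and `audio_proj` are placeholder names) showing that pattern:

```python
import torch.nn as nn

# Toy stand-in for an audio-LLM: one "backbone" module and one "audio" module.
class ToyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.llama_layer = nn.Linear(8, 8)  # stands in for the LLM backbone
        self.audio_proj = nn.Linear(8, 8)   # stands in for the audio projector

model = ToyModel()

# Freeze everything first (the LLaVA-style freeze_backbone step) ...
model.requires_grad_(False)

# ... then re-enable only the audio-side parameters, as in the LTU loop above.
for name, param in model.named_parameters():
    if "audio" in name:
        param.requires_grad = True

trainable = [n for n, p in model.named_parameters() if p.requires_grad]
# Only audio_proj.weight and audio_proj.bias remain trainable.
```

If the backbone is never frozen explicitly, the `if "audio" in name` loop alone would leave all LLaMA parameters at their default `requires_grad=True`, which is why the location of the freezing step matters.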
