```python
# LLaVA: freeze the entire LLM backbone in a single call
if model_args.freeze_backbone:
    model.model.requires_grad_(False)
```

In the LTU code, by contrast, note that the LLM has already been frozen; the loop below only unfreezes the audio-side parameters.

```python
# For audio params: LoRA weights stay trainable, LLaMA stays frozen
for name, param in model.named_parameters():
    if trainable_params == 'all':
        if "audio" in name:
            param.requires_grad = True
            # print(f"Parameter: {name}, requires_grad: {param.requires_grad}")
    if trainable_params == 'proj':
        if "audio_proj" in name:
            param.requires_grad = True
            # print(f"Parameter: {name}, requires_grad: {param.requires_grad}")
```
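The two snippets above combine into a common pattern: freeze everything first, then unfreeze by substring match on parameter names. A minimal, self-contained sketch of that pattern (the toy module names `backbone` and `audio_proj` are hypothetical, chosen to mirror the LTU naming):

```python
import torch.nn as nn

# Toy model (hypothetical names) with an audio projection
# alongside a backbone that should stay frozen.
model = nn.ModuleDict({
    "backbone": nn.Linear(8, 8),
    "audio_proj": nn.Linear(8, 8),
})

# Step 1: freeze everything, as LLaVA does for the backbone.
model.requires_grad_(False)

# Step 2: unfreeze by substring match on parameter name, as in LTU.
for name, param in model.named_parameters():
    if "audio_proj" in name:
        param.requires_grad = True

trainable = [n for n, p in model.named_parameters() if p.requires_grad]
# trainable now contains only the audio_proj weight and bias
```

Matching on `name` strings is fragile if module names change, but it is why both codebases can gate training behavior with nothing more than a naming convention.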