Description
After training a .pt file with the flux_slider script, I moved it into ComfyUI's lora folder. When I try to generate an image with it, the following error occurs:
FETCH DATA from: /tmp/code/ComfyUI/custom_nodes/ComfyUI-Manager/extension-node-map.json [DONE]
got prompt
Using xformers attention in VAE
Using xformers attention in VAE
clip missing: ['text_projection.weight']
model weight dtype torch.float8_e4m3fn, manual cast: torch.bfloat16
model_type FLUX
lora key not loaded: lora_unet_transformer_blocks_0_attn_to_q.alpha
lora key not loaded: lora_unet_transformer_blocks_0_attn_to_q.lora_down.weight
lora key not loaded: lora_unet_transformer_blocks_0_attn_to_q.lora_up.weight
lora key not loaded: lora_unet_transformer_blocks_0_attn_to_k.alpha
lora key not loaded: lora_unet_transformer_blocks_0_attn_to_k.lora_down.weight
lora key not loaded: lora_unet_transformer_blocks_0_attn_to_k.lora_up.weight
lora key not loaded: lora_unet_transformer_blocks_0_attn_to_v.alpha
lora key not loaded: lora_unet_transformer_blocks_0_attn_to_v.lora_down.weight
lora key not loaded: lora_unet_transformer_blocks_0_attn_to_v.lora_up.weight
lora key not loaded: lora_unet_transformer_blocks_0_attn_add_k_proj.alpha
lora key not loaded: lora_unet_transformer_blocks_0_attn_add_k_proj.lora_down.weight
lora key not loaded: lora_unet_transformer_blocks_0_attn_add_k_proj.lora_up.weight
lora key not loaded: lora_unet_transformer_blocks_0_attn_add_v_proj.alpha
lora key not loaded: lora_unet_transformer_blocks_0_attn_add_v_proj.lora_down.weight
lora key not loaded: lora_unet_transformer_blocks_0_attn_add_v_proj.lora_up.weight
lora key not loaded: lora_unet_transformer_blocks_0_attn_add_q_proj.alpha
lora key not loaded: lora_unet_transformer_blocks_0_attn_add_q_proj.lora_down.weight
lora key not loaded: lora_unet_transformer_blocks_0_attn_add_q_proj.lora_up.weight
lora key not loaded: lora_unet_transformer_blocks_0_attn_to_out_0.alpha
lora key not loaded: lora_unet_transformer_blocks_0_attn_to_out_0.lora_down.weight
lora key not loaded: lora_unet_transformer_blocks_0_attn_to_out_0.lora_up.weight
lora key not loaded: lora_unet_transformer_blocks_0_attn_to_add_out.alpha
lora key not loaded: lora_unet_transformer_blocks_0_attn_to_add_out.lora_down.weight
lora key not loaded: lora_unet_transformer_blocks_0_attn_to_add_out.lora_up.weight
lora key not loaded: lora_unet_transformer_blocks_1_attn_to_q.alpha
lora key not loaded: lora_unet_transformer_blocks_1_attn_to_q.lora_down.weight
lora key not loaded: lora_unet_transformer_blocks_1_attn_to_q.lora_up.weight
lora key not loaded: lora_unet_transformer_blocks_1_attn_to_k.alpha
lora key not loaded: lora_unet_transformer_blocks_1_attn_to_k.lora_down.weight
lora key not loaded: lora_unet_transformer_blocks_1_attn_to_k.lora_up.weight
lora key not loaded: lora_unet_transformer_blocks_1_attn_to_v.alpha
lora key not loaded: lora_unet_transformer_blocks_1_attn_to_v.lora_down.weight
lora key not loaded: lora_unet_transformer_blocks_1_attn_to_v.lora_up.weight
lora key not loaded: lora_unet_transformer_blocks_1_attn_add_k_proj.alpha
lora key not loaded: lora_unet_transformer_blocks_1_attn_add_k_proj.lora_down.weight
lora key not loaded: lora_unet_transformer_blocks_1_attn_add_k_proj.lora_up.weight
lora key not loaded: lora_unet_transformer_blocks_1_attn_add_v_proj.alpha
lora key not loaded: lora_unet_transformer_blocks_1_attn_add_v_proj.lora_down.weight
lora key not loaded: lora_unet_transformer_blocks_1_attn_add_v_proj.lora_up.weight
lora key not loaded: lora_unet_transformer_blocks_1_attn_add_q_proj.alpha
lora key not loaded: lora_unet_transformer_blocks_1_attn_add_q_proj.lora_down.weight
lora key not loaded: lora_unet_transformer_blocks_1_attn_add_q_proj.lora_up.weight
lora key not loaded: lora_unet_transformer_blocks_1_attn_to_out_0.alpha
lora key not loaded: lora_unet_transformer_blocks_1_attn_to_out_0.lora_down.weight
lora key not loaded: lora_unet_transformer_blocks_1_attn_to_out_0.lora_up.weight
lora key not loaded: lora_unet_transformer_blocks_1_attn_to_add_out.alpha
lora key not loaded: lora_unet_transformer_blocks_1_attn_to_add_out.lora_down.weight
lora key not loaded: lora_unet_transformer_blocks_1_attn_to_add_out.lora_up.weight
lora key not loaded: lora_unet_transformer_blocks_2_attn_to_q.alpha
lora key not loaded: lora_unet_transformer_blocks_2_attn_to_q.lora_down.weight
lora key not loaded: lora_unet_transformer_blocks_2_attn_to_q.lora_up.weight
lora key not loaded: lora_unet_transformer_blocks_2_attn_to_k.alpha
lora key not loaded: lora_unet_transformer_blocks_2_attn_to_k.lora_down.weight
lora key not loaded: lora_unet_transformer_blocks_2_attn_to_k.lora_up.weight
lora key not loaded: lora_unet_transformer_blocks_2_attn_to_v.alpha
lora key not loaded: lora_unet_transformer_blocks_2_attn_to_v.lora_down.weight
lora key not loaded: lora_unet_transformer_blocks_2_attn_to_v.lora_up.weight
lora key not loaded: lora_unet_transformer_blocks_2_attn_add_k_proj.alpha
lora key not loaded: lora_unet_transformer_blocks_2_attn_add_k_proj.lora_down.weight
lora key not loaded: lora_unet_transformer_blocks_2_attn_
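The log points to a key-naming mismatch rather than a corrupt file: the trained weights carry a kohya-style `lora_unet_` prefix on top of diffusers-style Flux module paths (`transformer_blocks.N.attn.*`), and ComfyUI skips every LoRA key it cannot map to the model, which is exactly the repeated `lora key not loaded` warning above. Below is a minimal, untested sketch of one possible workaround: it renames the keys to plain dotted diffusers names (`transformer.transformer_blocks.N.attn.*`), which recent ComfyUI builds recognize for Flux LoRAs. The file paths are placeholders and the target naming is an assumption inferred from the keys in the log, so check that the converted file loads without warnings; any `single_transformer_blocks` keys (not visible in the truncated log) would need the same treatment.

```python
# Untested sketch: rename kohya-prefixed diffusers keys into dotted diffusers
# names that ComfyUI can map for Flux. Paths and the target naming scheme are
# assumptions, not confirmed behavior of either project.
import re

import torch
from safetensors.torch import save_file

# diffusers spells the attention output projection "to_out.0";
# the other projections map one-to-one.
ATTN_NAMES = {
    "to_q": "to_q", "to_k": "to_k", "to_v": "to_v",
    "add_q_proj": "add_q_proj", "add_k_proj": "add_k_proj",
    "add_v_proj": "add_v_proj",
    "to_out_0": "to_out.0", "to_add_out": "to_add_out",
}

src = torch.load("flux_slider.pt", map_location="cpu")  # hypothetical input path
print("first keys:", list(src.keys())[:5])  # confirm what the trainer emitted

dst = {}
for key, tensor in src.items():
    # e.g. key = "lora_unet_transformer_blocks_0_attn_to_q.lora_down.weight"
    module, _, suffix = key.partition(".")  # suffix: "alpha" / "lora_down.weight" / "lora_up.weight"
    m = re.match(r"lora_unet_transformer_blocks_(\d+)_attn_(.+)$", module)
    if m and m.group(2) in ATTN_NAMES:
        block, proj = m.group(1), ATTN_NAMES[m.group(2)]
        dst[f"transformer.transformer_blocks.{block}.attn.{proj}.{suffix}"] = tensor
    else:
        dst[key] = tensor  # leave unrecognized keys untouched

save_file(dst, "flux_slider_converted.safetensors")  # hypothetical output path
```

If the warnings persist after converting, the printed key sample shows exactly what the trainer emitted, and the rename pattern can be adjusted to match.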