
[SDXLLongPromptWeightingPipeline load lora] ValueError: Adapter name(s) {'toy'} not in the list of present adapters: {'default_0'}. #10952

@kolar1988


Describe the bug

Loading a LoRA with SDXLLongPromptWeightingPipeline and then calling set_adapters raises:
ValueError: Adapter name(s) {'toy'} not in the list of present adapters: {'default_0'}.

Diffusers version: 0.32.2

Reproduction

import torch
from diffusers import DiffusionPipeline
from lpw_stable_diffusion_xl import SDXLLongPromptWeightingPipeline
# https://github.com/huggingface/diffusers/blob/v0.32.2/examples/community/lpw_stable_diffusion_xl.py

pipe_id = "stabilityai/stable-diffusion-xl-base-1.0"
# pipe = DiffusionPipeline.from_pretrained(pipe_id, torch_dtype=torch.float16).to("cuda")
pipe = SDXLLongPromptWeightingPipeline.from_pretrained(pipe_id, torch_dtype=torch.float16, add_watermarker=False).to("cuda")

# https://huggingface.co/CiroN2022/toy-face
pipe.load_lora_weights("CiroN2022/toy-face", weight_name="toy_face_sdxl.safetensors", adapter_name="toy")
pipe.set_adapters([ "toy"], adapter_weights=[1.0])

"""
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/venv/lib/python3.10/site-packages/diffusers/loaders/lora_base.py", line 682, in set_adapters
    raise ValueError(
ValueError: Adapter name(s) {'toy'} not in the list of present adapters: {'default_0'}.
"""

# With diffusers==0.30.0, both DiffusionPipeline and SDXLLongPromptWeightingPipeline work.
# With diffusers==0.32.2, DiffusionPipeline still works, but SDXLLongPromptWeightingPipeline raises the error above.
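For context, the error message comes from a name check in `lora_base.set_adapters`. The sketch below is a simplified, hypothetical reimplementation of that check (not the actual diffusers source); it shows why the call fails when `load_lora_weights` registers the LoRA under the default name "default_0" instead of the requested adapter_name "toy":

```python
# Simplified sketch of the validation in lora_base.set_adapters
# (hypothetical minimal reimplementation, not the actual diffusers code).
def set_adapters(present_adapters, requested_names):
    missing = set(requested_names) - set(present_adapters)
    if missing:
        raise ValueError(
            f"Adapter name(s) {missing} not in the list of present adapters: "
            f"{set(present_adapters)}."
        )
    return list(requested_names)

# The symptom: the LoRA was registered as "default_0", so asking for "toy" fails.
try:
    set_adapters(["default_0"], ["toy"])
except ValueError as e:
    print(e)
```

In other words, the bug is not in `set_adapters` itself; the `adapter_name="toy"` argument passed to `load_lora_weights` appears to be ignored by the community pipeline on 0.32.2, so the adapter ends up registered under the fallback name.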

Logs

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/venv/lib/python3.10/site-packages/diffusers/loaders/lora_base.py", line 682, in set_adapters
    raise ValueError(
ValueError: Adapter name(s) {'toy'} not in the list of present adapters: {'default_0'}.

System Info

  • 🤗 Diffusers version: 0.32.2
  • Platform: Linux-5.4.0-100-generic-x86_64-with-glibc2.35
  • Running on Google Colab?: No
  • Python version: 3.10.12
  • PyTorch version (GPU?): 2.4.1+cu118 (True)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed
  • Huggingface_hub version: 0.29.1
  • Transformers version: 4.49.0
  • Accelerate version: 1.1.1
  • PEFT version: 0.14.0
  • Bitsandbytes version: 0.44.1
  • Safetensors version: 0.4.5
  • xFormers version: 0.0.28.post1
  • Accelerator: Tesla V100-PCIE-32GB, 32768 MiB

Who can help?

@sayakpaul @DN6
