Commit 12cc587

fix fbcache param (#125)
Parent: cd4e102

File tree

1 file changed: +2 −2 lines

diffsynth_engine/models/flux/flux_dit_fbcache.py

Lines changed: 2 additions & 2 deletions
@@ -188,7 +188,7 @@ def from_state_dict(
         dtype: torch.dtype,
         in_channel: int = 64,
         attn_kwargs: Optional[Dict[str, Any]] = None,
-        fb_cache_relative_l1_threshold: float = 0.05,
+        relative_l1_threshold: float = 0.05,
     ):
         with no_init_weights():
             model = torch.nn.utils.skip_init(
@@ -197,7 +197,7 @@ def from_state_dict(
                 dtype=dtype,
                 in_channel=in_channel,
                 attn_kwargs=attn_kwargs,
-                fb_cache_relative_l1_threshold=fb_cache_relative_l1_threshold,
+                relative_l1_threshold=relative_l1_threshold,
             )
         model = model.requires_grad_(False)  # for loading gguf
         model.load_state_dict(state_dict, assign=True)
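For context, the renamed `relative_l1_threshold` parameter is the knob that first-block caching (FBCache) typically uses to decide whether a denoising step can reuse a cached result: if the first transformer block's output changed by less than the threshold (in relative L1 terms) since the last computed step, the remaining blocks are skipped. The sketch below is illustrative only; the class and helper names are hypothetical and do not reflect the actual diffsynth_engine implementation.

```python
# Hypothetical sketch of a relative-L1 skip decision as used in
# first-block caching; not the actual diffsynth_engine API.
import torch


def relative_l1_distance(prev: torch.Tensor, curr: torch.Tensor) -> float:
    """Mean absolute change, normalized by the magnitude of the previous tensor."""
    return ((curr - prev).abs().mean() / prev.abs().mean()).item()


class FirstBlockCache:
    def __init__(self, relative_l1_threshold: float = 0.05):
        # Same default as in the diff above: skip when the first block's
        # output drifted by less than 5% relative L1.
        self.relative_l1_threshold = relative_l1_threshold
        self.prev_first_block: torch.Tensor | None = None

    def should_reuse(self, first_block_out: torch.Tensor) -> bool:
        """Return True when the cached result of the remaining blocks can be reused."""
        if self.prev_first_block is None:
            return False  # first step: nothing cached yet
        dist = relative_l1_distance(self.prev_first_block, first_block_out)
        return dist < self.relative_l1_threshold
```

A larger threshold skips more steps (faster, lossier); a smaller one recomputes more often.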
