Custom ComfyUI nodes that super‑charge FluxDev‑style diffusion workflows.
If “prompt engineering” is your Katana, these nodes are the enchanted runes you carve into the blade to make it slice cleaner, faster, smarter.
- Layer‑aware conditioning – target specific Flux blocks (encoder, base, decoder, out) with per‑layer multipliers (see the sketch right after this list)
- Blend, add, or spatially concat two completely different prompt stacks without losing context
- Built‑in sanity checks & tensor normalization so you stop exploding your VRAM (and your temper)
- Verbose summaries pushed back to the ComfyUI front‑end via `PromptServer` – know exactly what each node did in plain English
- Designed for advanced LoRA / CLIP‑T5 workflows where you juggle dozens of conditioning tensors like a caffeinated circus act
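A minimal sketch of the per‑block multiplier idea, assuming ComfyUI's usual conditioning format (a list of `[tensor, extras]` pairs) and block names matching the UI sliders. The real node does more validation and reporting; the `block_multipliers` key below is purely illustrative.

```python
# Illustrative only – not the node's source. Assumes ComfyUI conditioning is a
# list of [tensor, extras_dict] pairs and that block names match the UI sliders.
import torch

def apply_block_multipliers(conditioning, multipliers):
    """Attach per-block strengths and scale each tensor by the 'base' strength."""
    layered = []
    for cond, extras in conditioning:
        extras = dict(extras)                      # never mutate the caller's dict
        extras["block_multipliers"] = multipliers  # hypothetical key a Flux-aware
                                                   # sampler could read per block
        layered.append([cond * multipliers.get("base", 1.0), extras])
    return layered

# Dummy prompt stack standing in for a CLIPTextEncode output.
conditioning = [[torch.randn(1, 256, 4096), {}]]
layered = apply_block_multipliers(
    conditioning, {"encoder": 0.8, "base": 1.2, "decoder": 1.0, "out": 1.0}
)
```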
The node list and micro‑pitches below were auto‑extracted from the source 📜
| Node | I/O | Elevator Pitch |
|---|---|---|
| `TenosaiFluxConditioningApply` | conditioning → layered conditioning | Apply per‑block strength multipliers to one conditioning list. |
| `TenosaiFluxConditioningBlend` | cond1 + cond2 → layered conditioning | Blend two conditioning stacks per block with independent ratios. |
| `TenosaiFluxConditioningSummedBlend` | cond1 + cond2 → single conditioning | Same as above, but outputs a single summed tensor (keeps things slim). |
| `TenosaiFluxConditioningAddScaledSum` | cond1 + cond2 → single conditioning | Scale cond2 per block, sum it, then add the result onto cond1. Think “add a spicy topping”. |
| `TenosaiFluxConditioningSpatialWeightedConcat` | cond1 + cond2 → single conditioning | Concatenate cond1 & scaled cond2 along the sequence dimension for spatial hacks. |
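For intuition, here is roughly what the blend, summed blend, and add‑scaled‑sum operations above boil down to. This is back‑of‑the‑envelope math on plain tensors with made‑up block names and ratios – not the nodes' internals.

```python
# Rough math behind the blend-style nodes, shown on bare tensors.
import torch

c1 = torch.randn(1, 256, 4096)   # dummy prompt-stack tensors
c2 = torch.randn(1, 256, 4096)

ratios = {"encoder": 0.7, "base": 0.5, "decoder": 0.3, "out": 0.5}

# Blend: one blended tensor per block, each with its own ratio.
blended = {name: r * c1 + (1.0 - r) * c2 for name, r in ratios.items()}

# Summed blend: collapse the per-block results into a single tensor.
summed = torch.stack(list(blended.values())).mean(dim=0)

# Add scaled sum: scale cond2 per block, sum the scaled copies, add onto cond1.
scales = {"encoder": 0.2, "base": 0.4, "decoder": 0.1, "out": 0.3}
spicy_topping = sum(s * c2 for s in scales.values())
result = c1 + spicy_topping
```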
```bash
# 1. Clone (or submodule) into your ComfyUI custom_nodes folder
git clone https://github.com/YourUsername/tenos-flux-conditioning-nodes.git \
  ~/ComfyUI/custom_nodes/TenosFluxConditioning

# 2. (Optional) Create a fresh venv & install PyTorch if you somehow skipped that step 🤨

# 3. Fire up ComfyUI and look for the “Tenos.ai/Conditioning” category.
```
> **Requires**:
>
> * PyTorch 2.1+ (CUDA or ROCm)
> * ComfyUI (latest main branch)
> * A model/scheduler that actually uses Flux blocks (e.g. *FluxDev* or similar)
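If the category never shows up, ComfyUI most likely didn't import the package. For reference, this is the standard ComfyUI registration pattern – a stripped‑down stand‑in to show where the “Tenos.ai/Conditioning” menu path comes from, not this repo's actual class (the real nodes take more inputs):

```python
# Standard ComfyUI custom-node boilerplate; inputs are simplified for brevity.
class TenosaiFluxConditioningApply:
    CATEGORY = "Tenos.ai/Conditioning"   # the menu path you search for in ComfyUI
    RETURN_TYPES = ("CONDITIONING",)
    FUNCTION = "apply"

    @classmethod
    def INPUT_TYPES(cls):
        return {"required": {"conditioning": ("CONDITIONING",)}}

    def apply(self, conditioning):
        return (conditioning,)

# ComfyUI discovers nodes through this mapping in the package's __init__.py.
NODE_CLASS_MAPPINGS = {"TenosaiFluxConditioningApply": TenosaiFluxConditioningApply}
```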
- Load your prompts using whatever tokenizer node floats your boat.
- Drag in `TenosaiFluxConditioningApply` and connect your conditioning.
- Dial in block multipliers (e.g. `encoder = 0.8`, `base = 1.2`, etc.).
- Feed the output straight into your Flux‑compatible sampler.
- Profit.
```
CLIPTextEncode → TenosaiFluxConditioningApply → FluxSampler
                              ↑
              Adjust multipliers per block in UI
```
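The plain‑English summaries land in the browser via ComfyUI's `PromptServer`. Roughly, a node can push text to the front end like the sketch below – `PromptServer.instance.send_sync()` is real ComfyUI API, but the event name and payload shape here are made up for illustration.

```python
# Hedged sketch of pushing a node summary to the ComfyUI front-end.
from server import PromptServer  # only importable when running inside ComfyUI

def report(summary, node_id=None):
    PromptServer.instance.send_sync(
        "tenos.flux.summary",                # hypothetical event name
        {"node": node_id, "text": summary},  # hypothetical payload
    )

report("Applied multipliers: encoder=0.8, base=1.2, decoder=1.0, out=1.0")
```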
- Each node has tooltips on every slider/checkbox – hover like it’s 1999.
- Output strings are accessible in ComfyUI’s “impact‑node‑feedback” panel; copy‑paste for debugging.
- Shapes are automatically padded to the max sequence length across inputs. No more shape mismatches.
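The padding mentioned in the last point is essentially this: right‑pad every tensor with zeros up to the longest sequence length before combining. A rough sketch (the helper name is made up):

```python
# Hedged illustration of padding conditioning tensors to a common length.
import torch
import torch.nn.functional as F

def pad_to_max_seq(tensors):
    """Right-pad each [batch, seq, dim] tensor with zeros to the longest seq."""
    max_len = max(t.shape[1] for t in tensors)
    return [F.pad(t, (0, 0, 0, max_len - t.shape[1])) for t in tensors]

a = torch.randn(1, 77, 4096)    # e.g. a CLIP-length stack
b = torch.randn(1, 256, 4096)   # e.g. a T5-length stack
a_padded, b_padded = pad_to_max_seq([a, b])
assert a_padded.shape == b_padded.shape == (1, 256, 4096)
```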