Model Training ‐ Comparison ‐ [Decouple]
Models | Logs | Graphs | Configs
The last optimizer parameter in the comparison.

Google: Decouple instructs the optimizer to train the U-Net and the Text Encoder at different rates.

ChatGPT: The Text Encoder transforms natural language descriptions into numerical representations, typically using word embeddings and sequence encoding.

Habr: The U-Net iteratively transforms noise into the resulting image.
Compared values:
- true - B
- false - D
As with the previous parameter, it's not entirely clear where this one came from: if you look through the optimizer's documented parameters, you won't find either this one or the previous one. Strange.
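For context, in Adam-family optimizers a flag named `decouple` usually selects AdamW-style decoupled weight decay (decay applied directly to the weight) instead of classic L2 regularization (decay folded into the gradient). Whether that is what the flag does in this trainer is an assumption; the single-scalar sketch below only illustrates the general distinction and is not this trainer's actual code:

```python
import math

def adam_step(w, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
              eps=1e-8, weight_decay=0.01, decouple=True):
    """One simplified Adam update on a scalar weight.

    decouple=False: classic L2 - decay enters the gradient and is rescaled
                    by Adam's adaptive denominator.
    decouple=True:  AdamW-style - decay is applied to the weight directly,
                    independent of the adaptive step size.
    """
    if not decouple:
        grad = grad + weight_decay * w      # fold decay into the gradient
    m = beta1 * m + (1 - beta1) * grad      # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad * grad  # second-moment estimate
    m_hat = m / (1 - beta1 ** t)            # bias correction
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (math.sqrt(v_hat) + eps)
    if decouple:
        w = w - lr * weight_decay * w       # decay the weight directly
    return w, m, v
```

With adaptive preconditioning the two variants produce different weights even after a single step, which is why the flag can matter for training dynamics.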
Graphs: DLR (step) and Loss (epoch).
Although with decouple=false the DLR graph shows a sharp increase and the loss matches the other models, the resulting model has no effect on the output at any stage of training.
decouple=true is a must-have optimizer parameter.