Commit 2a5ce0b

Fix bug in gradient accumulation implementation
1 parent c5e8513 commit 2a5ce0b

1 file changed: 1 addition, 1 deletion

unit2/01_finetuning_and_guidance.ipynb

Lines changed: 1 addition & 1 deletion
```diff
@@ -690,7 +690,7 @@
     loss.backward(loss)

     # Gradient accumulation:
-    if (i + 1) % grad_accumulation_steps == 0:
+    if (step + 1) % grad_accumulation_steps == 0:
         optimizer.step()
         optimizer.zero_grad()

```
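The fix keys the accumulation check off the loop counter `step` rather than `i`. A minimal pure-Python sketch of that bookkeeping (the model, loss, and optimizer are replaced by a stand-in counter; `num_batches` and the accumulation interval are hypothetical values, not from the notebook):

```python
# Sketch of gradient-accumulation cadence: an optimizer update fires
# once every `grad_accumulation_steps` batches, keyed off the loop
# counter `step` as in the fixed condition above.
grad_accumulation_steps = 4   # hypothetical value
num_batches = 12              # hypothetical value
optimizer_updates = 0

for step in range(num_batches):
    # loss.backward() would accumulate gradients here
    if (step + 1) % grad_accumulation_steps == 0:
        optimizer_updates += 1   # stands in for optimizer.step()
        # optimizer.zero_grad() would clear the accumulated gradients

print(optimizer_updates)  # 12 batches / 4 accumulation steps -> 3 updates
```

Using the loop counter guarantees the update fires on a fixed cadence; if the condition tests a variable that is not advancing with the loop, `optimizer.step()` either never runs or runs every batch, silently disabling accumulation.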
