
Commit f7084fc
fix llama4 parallelize syntax bug (#1316)
just to land the fix in #1307 as I'm hoping to publish a release today
Parent: 61ef5cf

1 file changed: 1 addition, 1 deletion
torchtitan/experiments/llama4/infra/parallelize.py

Lines changed: 1 addition & 1 deletion
```diff
@@ -118,7 +118,7 @@ def parallelize_llama(
         )

     # for MoE auxiliary-loss-free load balancing
-    if parallel_dims.dp_cp_enabled is not None:
+    if parallel_dims.dp_cp_enabled:
         # NOTE: Currently this sync is blocking (thus exposed) and happens on the
         # default compute stream. Need to assess if this is OK performance-wise.
         dp_cp_mesh = world_mesh["dp_cp"]
```
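The bug is easy to reproduce in isolation: `dp_cp_enabled` is a boolean, and a boolean is never `None`, so the original `is not None` check is always true and the guarded block runs even when the feature is disabled. The sketch below uses a hypothetical stand-in class, not torchtitan's real `ParallelDims`, to show the difference between the two conditions:

```python
class ParallelDims:
    """Hypothetical stand-in for torchtitan's ParallelDims, reduced to one flag."""

    def __init__(self, dp_cp_enabled: bool):
        self.dp_cp_enabled = dp_cp_enabled


parallel_dims = ParallelDims(dp_cp_enabled=False)

# Buggy check: a bool is never None, so this is True even when disabled.
buggy = parallel_dims.dp_cp_enabled is not None

# Fixed check: tests the flag's actual truth value.
fixed = bool(parallel_dims.dp_cp_enabled)

print(buggy)  # True  -- the load-balancing sync would run by mistake
print(fixed)  # False -- correctly skipped when the feature is off
```

A pattern like `x is not None` is appropriate when `x` is an optional object (e.g. a mesh that may be absent), but for a plain boolean flag the bare truthiness test is the correct idiom.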
