Revert changes
Signed-off-by: Yuanyuan Chen <[email protected]>
cyyever committed Oct 10, 2025
commit 75c3b0c9d27a8b04ebcef4126a05b6465cb81e1c
torch/distributed/optim/optimizer.py: 2 additions & 1 deletion
@@ -52,7 +52,8 @@ def step(self, autograd_ctx_id: int):
         all_local_grads = dist_autograd.get_gradients(autograd_ctx_id)
         # apply functional optimizer step with a list of gradients
         grads: list[Optional[Tensor]] = [
-            all_local_grads.get(p, None) for p in self._local_params
+            all_local_grads[p] if p in all_local_grads else None  # noqa: SIM401
+            for p in self._local_params
         ]
 
         self.optim.step(grads)
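
For context, a minimal standalone sketch of the comprehension this commit restores. The stand-in objects and string "gradients" below are hypothetical; the real code maps parameter Tensors to gradient Tensors returned by dist_autograd.get_gradients(). The point of the pattern: parameters with no recorded gradient map to None, so grads stays positionally aligned with self._local_params. The noqa: SIM401 comment suppresses the lint suggestion to rewrite the membership test as dict.get; a plausible but unconfirmed reason for reverting to the explicit test is that step is TorchScript-compiled in this file, and .get(p, None) on a Dict[Tensor, Tensor] can fail to type-check there because the default must match the value type. The commit message itself says only "Revert changes".

# Minimal sketch (hypothetical stand-ins; the real code uses Tensors
# and dist_autograd.get_gradients(autograd_ctx_id)).
from typing import Optional

p1, p2, p3 = object(), object(), object()         # stand-ins for local params
all_local_grads = {p1: "grad_p1", p3: "grad_p3"}  # p2 received no gradient
local_params = [p1, p2, p3]

# Params absent from the gradient map become None, keeping grads
# positionally aligned with local_params.
grads: list[Optional[str]] = [
    all_local_grads[p] if p in all_local_grads else None  # noqa: SIM401
    for p in local_params
]
assert grads == ["grad_p1", None, "grad_p3"]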