
Commit

Do not prepare lr scheduler as it has the right number of steps (huggingface#24088)

* Do not prepare lr scheduler as it has the right number of steps

* Trigger CI

* Add fake comment

* Remove fake comment

* Trigger CI please!
sgugger authored and novice03 committed Jun 23, 2023
1 parent 4b9ec1b commit b6241e2
Showing 1 changed file with 2 additions and 3 deletions.
src/transformers/trainer.py

@@ -1747,9 +1747,7 @@ def _inner_training_loop(
 
         # prepare using `accelerator` prepare
         if use_accelerator_prepare:
-            model, self.optimizer, self.lr_scheduler = self.accelerator.prepare(
-                self.model, self.optimizer, self.lr_scheduler
-            )
+            model, self.optimizer = self.accelerator.prepare(self.model, self.optimizer)
 
         if self.is_fsdp_enabled:
             self.model = model
@@ -1996,6 +1994,7 @@ def _inner_training_loop(
                     optimizer_was_run = scale_before <= scale_after
                 else:
                     self.optimizer.step()
+                    optimizer_was_run = not self.accelerator.optimizer_step_was_skipped
 
                 if optimizer_was_run:
                     # Delay optimizer scheduling until metrics are generated
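For context, here is a minimal sketch of the pattern this commit leaves the Trainer with. The model, data, and step count below are hypothetical; Accelerator, prepare, backward, and optimizer_step_was_skipped are existing Accelerate APIs. Accelerate's scheduler wrapper steps a prepared scheduler once per process (to compensate for data-parallel batching) and skips it whenever the gradient scaler skips the optimizer step. Since the Trainer already creates its scheduler with the right total number of steps, the first hunk keeps the scheduler out of prepare, and the second hunk reinstates the skip check by hand:

# A sketch, not Trainer code: the scheduler is built for the full step count,
# kept out of prepare(), and stepped only when the optimizer actually ran.
import torch
from accelerate import Accelerator

accelerator = Accelerator()  # with mixed_precision="fp16" on GPU, steps may be skipped

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)

num_training_steps = 1000  # hypothetical; Trainer derives this from the dataloader
lr_scheduler = torch.optim.lr_scheduler.LambdaLR(
    optimizer, lambda step: max(0.0, 1.0 - step / num_training_steps)
)

# Only the model and optimizer are prepared (first hunk): the scheduler
# already has the right number of steps, so Accelerate must not rescale it.
model, optimizer = accelerator.prepare(model, optimizer)

for _ in range(num_training_steps):
    inputs = torch.randn(8, 10, device=accelerator.device)
    loss = model(inputs).sum()
    accelerator.backward(loss)
    optimizer.step()
    # Second hunk: do not advance the schedule on a skipped optimizer step.
    if not accelerator.optimizer_step_was_skipped:
        lr_scheduler.step()
    optimizer.zero_grad()

The net effect mirrors the Trainer after this commit: the schedule length stays correct in distributed runs, and fp16 overflow steps no longer consume scheduler steps.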
