Commit 05a8b5e

Disable loss rounding in training stats log

1 parent 47e795b commit 05a8b5e

File tree

1 file changed: +1 −1 lines changed


src/transformers/trainer.py

Lines changed: 1 addition & 1 deletion
@@ -3004,7 +3004,7 @@ def _maybe_log_save_evaluate(
             # reset tr_loss to zero
             tr_loss -= tr_loss

-            logs["loss"] = round(tr_loss_scalar / (self.state.global_step - self._globalstep_last_logged), 4)
+            logs["loss"] = tr_loss_scalar / (self.state.global_step - self._globalstep_last_logged)
             if grad_norm is not None:
                 logs["grad_norm"] = grad_norm.item() if isinstance(grad_norm, torch.Tensor) else grad_norm
             if learning_rate is not None:
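The practical effect of dropping `round(..., 4)` can be sketched in isolation: rounding the averaged loss to four decimal places collapses very small losses (common late in training or with scaled losses) to `0.0` in the logs, while the unrounded value preserves full float precision. The variable names below mirror the `Trainer` code, but the values are illustrative, not taken from the commit.

```python
# Illustrative values; in Trainer, tr_loss_scalar is the accumulated loss
# and the divisor is self.state.global_step - self._globalstep_last_logged.
tr_loss_scalar = 0.00012345
steps_since_last_log = 10

# Old behavior: round to 4 decimal places before logging.
old_loss = round(tr_loss_scalar / steps_since_last_log, 4)  # tiny losses become 0.0

# New behavior: log the full-precision average.
new_loss = tr_loss_scalar / steps_since_last_log

print(old_loss)  # 0.0
print(new_loss)  # ~1.2345e-05, no information lost to rounding
```

The logged `"loss"` value is typically formatted downstream (progress bars, TensorBoard, W&B), so rounding at the source only destroyed precision without improving display.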

0 commit comments
