How do training_step wrappers affect backprop?
#11722 · Unanswered
OverLordGoldDragon asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 0 comments
The two linked snippets do not behave the same, despite having identical execution order: the latter no longer reproduces the output, while the former keeps reproducing it no matter how many times `_training_step` is called. The code executes with `torch.use_deterministic_algorithms(True)`, and the same `batch` and `batch_idx` are passed in each case. The wrapper code appears to modify the model in some way. What are some likely culprits?
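The original code links are not preserved here, so the exact wrapper is unknown. One common culprit in this situation is a wrapper that makes extra stochastic calls, advancing the global RNG stream so that subsequent draws (dropout, sampling, augmentation) differ even under deterministic algorithms. A minimal sketch of that effect, using Python's `random` module as a stand-in for a stochastic training step (all names hypothetical):

```python
import random

def training_step():
    # Stand-in for a stochastic training step (e.g. one that
    # applies dropout): consumes one draw from the global RNG.
    return random.random()

def wrapped_training_step():
    # Hypothetical wrapper that ALSO consumes RNG state, e.g. for
    # extra initialization, sample logging, or augmentation.
    _ = random.random()  # this extra draw shifts the RNG stream
    return training_step()

random.seed(0)
direct = [training_step() for _ in range(3)]

random.seed(0)
wrapped = [wrapped_training_step() for _ in range(3)]

# Same seed, identical step logic, yet the sequences diverge
# because the wrapper advanced the RNG between steps.
print(direct == wrapped)  # False
```

Deterministic algorithms guarantee the same output for the same inputs *and* the same RNG state; they do not protect against the wrapper consuming randomness. Other candidates worth checking: the wrapper registering hooks, toggling `train()`/`eval()` mode, or mutating parameters/buffers before the inner call.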