Hi, I recently encountered this:
accelerate/src/accelerate/accelerator.py, line 1154 at commit 03754c1
Correct me if I'm wrong, but it seems that when DeepSpeed is enabled, the dataloader is only prepared during the first prepare call. After the first prepare, I guess train_micro_batch_size_per_gpu will no longer be 'auto', so the original dataloader2 is returned unprepared. Is this a bug, or is it expected behavior that is mentioned somewhere in the docs?
Thank you,
Trung
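For illustration, here is a minimal sketch of the two-call pattern described above. This is a hypothetical reproduction, not code from the issue: the model, optimizer, and dataset are placeholders, and it assumes DeepSpeed was enabled through accelerate config with train_micro_batch_size_per_gpu set to "auto" in the DeepSpeed config.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
from accelerate import Accelerator

# Assumed setup (not from the issue): DeepSpeed is enabled via `accelerate config`
# with train_micro_batch_size_per_gpu set to "auto" in the DeepSpeed config.
accelerator = Accelerator()

model = torch.nn.Linear(8, 2)
optimizer = torch.optim.AdamW(model.parameters())
dataloader1 = DataLoader(TensorDataset(torch.randn(64, 8)), batch_size=4)
dataloader2 = DataLoader(TensorDataset(torch.randn(64, 8)), batch_size=4)

# First prepare(): train_micro_batch_size_per_gpu is still "auto", so the value
# is filled in from dataloader1's batch size and the dataloader is prepared.
model, optimizer, dataloader1 = accelerator.prepare(model, optimizer, dataloader1)

# Second prepare(): the config value is no longer "auto", so, per the report,
# dataloader2 comes back unchanged instead of being wrapped/prepared.
dataloader2 = accelerator.prepare(dataloader2)
```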