Replace train_dataloader and evaluators on state with just dataloader #303

Closed · ravi-mosaicml opened this issue on Jan 31, 2022 · 2 comments

ravi-mosaicml (Contributor) commented on Jan 31, 2022:

  1. Algorithms should NOT modify the dataloader to do data augmentations; instead, they should use the after-batch event (see the sketch after this list).
  2. Algorithms and callbacks need to reference only the current dataloader -- they don't need to operate on a dataloader that is not being used
  3. It may be possible to remove the dataloader from the state entirely. However, this will require Time Abstraction #146 to be fully implemented.
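
As an illustration of the pattern in point 1, here is a minimal sketch of a batch augmentation written as a Composer `Algorithm` rather than a dataloader wrapper. Everything here is illustrative rather than taken from this thread: the `AddGaussianNoise` class is hypothetical, `Event.AFTER_DATALOADER` is assumed to be the hook where the yielded batch is available on the state, the batch is assumed to be an `(inputs, targets)` pair, and the import paths follow recent Composer releases.

```python
import torch
from composer.core import Algorithm, Event, State
from composer.loggers import Logger


class AddGaussianNoise(Algorithm):
    """Hypothetical augmentation applied via an event hook,
    leaving the dataloader itself untouched."""

    def __init__(self, std: float = 0.1):
        self.std = std

    def match(self, event: Event, state: State) -> bool:
        # Fire once per batch, right after the dataloader yields it.
        return event == Event.AFTER_DATALOADER

    def apply(self, event: Event, state: State, logger: Logger) -> None:
        # Assumes the batch is an (inputs, targets) pair.
        inputs, targets = state.batch
        state.batch = (inputs + self.std * torch.randn_like(inputs), targets)
```

Because the augmentation reads and writes `state.batch` at an event, it never needs a reference to the dataloader object, which is what would let the dataloader come off the state.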
hanlint (Contributor) commented on Jan 31, 2022:

For point 1, we have many algorithms that do modify the dataloader, because we want those augmentations to run on the CPU. after_batch is only for GPU-based augmentations.
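
For contrast, a minimal sketch of the CPU-side pattern described here: the augmentation is composed into the dataset's transform so it runs in the dataloader worker processes, before the batch ever reaches the GPU. The dataset, transforms, and loader settings are illustrative only.

```python
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# CPU-side augmentations run inside the dataloader workers -- the case
# an after-batch event hook cannot cover.
cpu_transform = transforms.Compose([
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
])
dataset = datasets.CIFAR10("data/", train=True, download=True,
                           transform=cpu_transform)
loader = DataLoader(dataset, batch_size=128, num_workers=4, shuffle=True)
```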

ravi-mosaicml (Contributor, Author) commented:

Closing in favor of #329, since that will remove the evaluators from the state. Algorithms will still require the train dataloader to be on the state.
