This repository has been archived by the owner on Sep 18, 2024. It is now read-only.
While running one-shot NAS experiments on custom datasets, I encountered some inconsistencies in the usage of .to(device) and to_device().
EnasTrainer, on the one hand, uses the to_device() function from nni.retiarii.oneshot.pytorch.utils to transfer the tensor to the specified device. DartsTrainer and ProxylessTrainer, on the other hand, use the standard torch .to() function.
Using .to() is problematic: torch.utils.data.DataLoader instances may return batches that are not torch.Tensor objects (e.g. dicts or lists of tensors) and therefore have no .to() method. The to_device() function handles these cases.
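For illustration, a minimal sketch of what such a recursive helper might look like. This is an assumption for the sake of the example; the actual to_device() in nni.retiarii.oneshot.pytorch.utils may be implemented differently:

```python
import torch

def to_device(obj, device):
    """Recursively move all tensors in a (possibly nested) batch to a device.

    Hypothetical sketch of a to_device-style helper, not the exact
    nni.retiarii.oneshot.pytorch.utils implementation.
    """
    if isinstance(obj, torch.Tensor):
        return obj.to(device)
    if isinstance(obj, dict):
        # e.g. batches produced by the default collate_fn for dict samples
        return {k: to_device(v, device) for k, v in obj.items()}
    if isinstance(obj, (list, tuple)):
        return type(obj)(to_device(x, device) for x in obj)
    # non-tensor leaves (ints, strings, ...) are returned unchanged
    return obj
```

Calling batch.to(device) on a dict batch would raise AttributeError, whereas a helper like this transparently handles tensors, dicts, lists, and tuples alike.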
Would it be possible to use to_device() across all Trainer classes?
Environment:
NNI version: 2.3
Python version: 3.8.8
PyTorch version: 1.9.0
Thank you,
Thomas
As I said in #3956, we will probably handle the device issue with a more general framework like PyTorch Lightning. Until that refactor lands, we encourage you to submit a PR changing all .to() calls to to_device(). Thanks in advance.