What would you like to be added:
Add configuration options that let the user decide how many trials can run on the same GPU.
For example,
MaxMemOnGPU: 2G -- lets the user declare how much GPU memory a single trial is expected to use, so the scheduler can estimate how many trials fit on one GPU.
MaxTrialsNumOnGPU: 3
EnableMultipleTrialsOnGPU: true
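A minimal sketch of how these proposed options might look in an experiment config file. The keys `MaxMemOnGPU`, `MaxTrialsNumOnGPU`, and `EnableMultipleTrialsOnGPU` are only the proposal in this issue, not existing NNI settings; the surrounding fields follow the usual NNI v1 YAML config and are shown just for context.

```yaml
# Hypothetical experiment config illustrating the proposed GPU-sharing options.
# The three keys under "proposed options" do not exist in NNI today; they are
# the new settings suggested in this issue.
authorName: default
experimentName: gpu_sharing_example
trainingServicePlatform: local
trialConcurrency: 6               # more concurrent trials than physical GPUs

# --- proposed options ---
EnableMultipleTrialsOnGPU: true   # allow the scheduler to pack several trials onto one GPU
MaxTrialsNumOnGPU: 3              # at most 3 trials per GPU
MaxMemOnGPU: 2G                   # each trial is expected to use about 2 GB of GPU memory

trial:
  command: python3 mnist.py
  codeDir: .
  gpuNum: 1
```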
Why is this needed: It can improve GPU utilization in cases where trials leave a large share of GPU memory free and neither I/O nor compute is the bottleneck.
Without this feature, how does current nni work:
NNI schedules at most one trial per GPU.
Components that may involve changes:
Brief description of your proposal if any: