
Refactor Hyperopt Tuners (Stage 1) - random tuner #4118

Merged
4 commits merged into microsoft:master from the random branch on Sep 13, 2021

Conversation

liuzhe-lz (Contributor) commented on Aug 27, 2021

The ultimate goal is to get rid of the hyperopt dependency. The code that converts formats between NNI and hyperopt is nearly impossible to understand.
Four months ago, issue 3534 reported a bug in the TPE tuner, and I have had absolutely no idea how to debug it since then.

q: Optional[float] = None                  # quantization interval, if any
log_distributed: Optional[bool] = None     # whether the value is sampled on a log scale

def is_activated(self, partial_parameters):

Contributor

Could you add some comments on what this is for?

liuzhe-lz (Author) commented on Aug 30, 2021

Added informal doc.
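
Not the PR's actual code, but a minimal sketch of what an activation check for nested (conditional) parameters could look like; ParameterSpec, parent_key, and activation_value are illustrative names, not fields taken from this diff:

from dataclasses import dataclass
from typing import Any, Optional

@dataclass
class ParameterSpec:
    """Hypothetical subset of a parameter spec, for illustration only."""
    name: str
    q: Optional[float] = None                # quantization interval, if any
    log_distributed: Optional[bool] = None   # sampled on a log scale?
    # For nested (conditional) search spaces: the key of the parent choice
    # parameter and the branch value that enables this parameter.
    parent_key: Optional[str] = None
    activation_value: Any = None

    def is_activated(self, partial_parameters: dict) -> bool:
        """True if this parameter should be sampled, given the parent
        choices already decided in partial_parameters."""
        if self.parent_key is None:
            return True                      # top-level parameter, always active
        return partial_parameters.get(self.parent_key) == self.activation_value

With a spec like this, a conditional parameter such as a kernel size under a "Conv" branch is only sampled when the parent choice actually selected that branch.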

from nni.common.hpo_utils import format_search_space, deformat_parameters
from nni.tuner import Tuner

class RandomTuner(Tuner):

Contributor

Does this tuner still support nested search spaces?

Contributor

I believe so.
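
For context, this is the kind of nested search space the question refers to, written here as a Python dict (conditional sub-parameters live under a choice branch keyed by _name); the example is illustrative and not taken from the PR:

# Nested (conditional) search space: "kernel_size" exists only when the
# "Conv" branch of "layer0" is chosen, "pooling_size" only under "Max_pool".
nested_search_space = {
    "layer0": {
        "_type": "choice",
        "_value": [
            {"_name": "Empty"},
            {"_name": "Conv",
             "kernel_size": {"_type": "choice", "_value": [1, 2, 3, 5]}},
            {"_name": "Max_pool",
             "pooling_size": {"_type": "choice", "_value": [2, 3, 5]}},
        ],
    }
}

Presumably format_search_space flattens such a space into per-parameter specs, the activation check discussed in the earlier thread decides whether a conditional parameter like kernel_size is sampled for a given trial, and deformat_parameters rebuilds the nested structure before it reaches the trial code.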

QuanluZhang merged commit 5308fd1 into microsoft:master on Sep 13, 2021
liuzhe-lz deleted the random branch on September 13, 2021 05:56