[Retiarii] end2end #3122
QuanluZhang commented Nov 24, 2020 (edited)
- support LayerChoice and InputChoice
- support the new experiment launch approach, i.e., directly launching a NAS experiment from Python code
- refactor graph IR: merge input/output names into input/output nodes; graph dump and load
- improve graph generation and code generation
- support the DARTS search space
- support the ENAS search space
- support the ProxylessNAS search space
- support the TPE tuner as a NAS strategy
- improve debuggability
nni/retiarii/converter/graph_gen.py
elif node.kind() == 'prim::GetAttr':
    pass
elif node.kind() == 'prim::Loop':
    print('mygraph: ', sm_graph)
Remove the print message.
    pass
elif node.kind() == 'prim::Loop':
    print('mygraph: ', sm_graph)
    raise RuntimeError('Loop has not been supported yet!')
What happens to other `prim::` ops?
test/convert_test/tpe_strategy.py
self.tpe_sampler = TPESampler()
self.model_id = 0

def run(self, base_model, applied_mutators, trainer):
It looks strange that strategy needs an extra trainer. Why don't we merge trainer into base model?
@@ -1,18 +1,31 @@
MODULE_EXCEPT_LIST = ['Sequential']
RETIARII_BASE_OPS = ['Placeholder']


class Type:
Why use a class here? What's the behavior of `Type()`?
Will change the class name and split this class into two classes.
Let me refactor this part in the next PR.
class LayerChoice(nn.Module):
    def __init__(self, candidate_ops: List, label: str = None):
        super(LayerChoice, self).__init__()
This is Python 2's style; `super().__init__()` is enough.
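A small self-contained illustration of the point (class names here are made up for the example): in Python 3 the zero-argument `super()` is equivalent to the explicit Python-2-compatible form.

```python
class Base:
    def __init__(self):
        self.ready = True

class OldStyle(Base):
    def __init__(self):
        # Python-2-compatible spelling: class and instance passed explicitly
        super(OldStyle, self).__init__()

class NewStyle(Base):
    def __init__(self):
        # Python 3: the compiler infers the class and instance
        super().__init__()

print(OldStyle().ready, NewStyle().ready)  # True True
```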
@@ -80,3 +94,7 @@ def __init__(self, *args, **kws):
ReLU = wrap_module(nn.ReLU)
Dropout = wrap_module(nn.Dropout)
Linear = wrap_module(nn.Linear)
MaxPool2d = wrap_module(nn.MaxPool2d)
Path of this file should contain "pytorch".
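For readers unfamiliar with the `wrap_module` pattern in this diff: a plausible minimal sketch is a wrapper class that records constructor arguments so a graph/code generator can later re-emit the call. This is an illustration only; `_FakeLinear` is a torch-free stand-in for `nn.Linear`, and the real NNI implementation may differ.

```python
class _FakeLinear:
    """Stand-in for nn.Linear so the sketch runs without torch."""
    def __init__(self, in_features, out_features):
        self.in_features = in_features
        self.out_features = out_features

def wrap_module(original_class):
    class Wrapped(original_class):
        def __init__(self, *args, **kwargs):
            # Record how the module was constructed, then delegate.
            self._init_args = args
            self._init_kwargs = kwargs
            super().__init__(*args, **kwargs)
    Wrapped.__name__ = original_class.__name__
    return Wrapped

Linear = wrap_module(_FakeLinear)
layer = Linear(4, 2)
print(layer._init_args)    # (4, 2)
print(layer.out_features)  # 2
```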
nni/retiarii/strategy.py
class BaseStrategy(abc.ABC):

    @abc.abstractmethod
    def run(self, base_model: 'Model', applied_mutators: List['Mutator'], trainer: 'BaseTrainer') -> None:
If it only has one method, I don't think there needs to be a class.
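The reviewer's suggestion can be sketched as follows: a single-method strategy can be a plain callable with an agreed signature instead of an `abc.ABC` subclass. The names `Model`, `Mutator`, and `random_strategy` here are placeholders, not the real Retiarii types.

```python
from typing import Any, Callable, List

# A strategy is just a callable: (base_model, applied_mutators, trainer) -> None
Strategy = Callable[[Any, List[Any], Any], None]

def random_strategy(base_model, applied_mutators, trainer):
    """Any function with this signature qualifies as a strategy."""
    for mutator in applied_mutators:
        pass  # sample and apply mutations here

def launch(strategy: Strategy, base_model, mutators, trainer):
    # The framework only needs to call the strategy; no base class required.
    strategy(base_model, mutators, trainer)

launch(random_strategy, base_model=None, mutators=[], trainer=None)
```

A class still makes sense if strategies carry state (e.g. a sampler) across calls, which is the trade-off behind this comment.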
@@ -17,6 +19,23 @@ class BaseTrainer(abc.ABC):
    Trainer has a ``fit`` function with no return value. Intermediate results and final results should be
    directly sent via ``nni.report_intermediate_result()`` and ``nni.report_final_result()`` functions.
    """
    def __init__(self, *args, **kwargs):
        module = self.__class__.__module__
        if module is None or module == str.__class__.__module__:
Isn't `str.__class__.__module__` built-in?
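Checking the question directly: `str.__class__` is `type`, whose `__module__` is the string `'builtins'` in Python 3 (it was `'__builtin__'` in Python 2). So the comparison in the diff is an indirect way of asking "is this class defined in the built-in module?".

```python
# str is an instance of type, and type lives in the builtins module.
print(str.__class__)             # <class 'type'>
print(str.__class__.__module__)  # builtins

class UserDefined:
    pass

# A user-defined class has its defining module (e.g. '__main__'), not 'builtins'.
print(UserDefined.__module__ == str.__class__.__module__)  # False
```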
remove_unconnected_nodes(ir_graph, targeted_type='prim::GetAttr')
merge_aten_slices(ir_graph)


def _handle_layerchoice(module):
So you finally decided not to make LayerChoice a placeholder?
Let's refactor this part in the future.