This repository has been archived by the owner on Sep 18, 2024. It is now read-only.

[Retiarii] end2end #3122

Merged: 61 commits merged into microsoft:dev-retiarii on Dec 11, 2020
Conversation

@QuanluZhang (Contributor) commented Nov 24, 2020:

  • support LayerChoice and InputChoice
  • support the new experiment launch approach, i.e., directly launching a NAS experiment from Python code
  • refactor graph IR: merge input/output names into input/output nodes; support graph dump and load
  • improve graph generation and code generation
    • support the DARTS search space
    • support the ENAS search space
    • support the ProxylessNAS search space
  • support the TPE tuner as a NAS strategy
  • improve debuggability
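To make the LayerChoice/InputChoice idea above concrete, here is a toy, framework-free sketch of what such a choice expresses: a mutable slot holding several candidate ops plus a label, which a mutation later resolves to a single op. All class and method names here are illustrative; the real API lives in NNI's Retiarii package and is built on `nn.Module`.

```python
class ToyLayerChoice:
    """Illustrative stand-in for LayerChoice: holds candidates until a
    mutation fixes the choice. Not NNI's actual implementation."""

    def __init__(self, candidate_ops, label=None):
        self.candidate_ops = list(candidate_ops)
        self.label = label
        self.chosen = None  # filled in later by a mutation

    def choose(self, index):
        # A mutator would call something like this to resolve the slot.
        self.chosen = self.candidate_ops[index]
        return self.chosen

    def __call__(self, x):
        if self.chosen is None:
            raise RuntimeError('choice not resolved yet')
        return self.chosen(x)

choice = ToyLayerChoice([lambda x: x + 1, lambda x: x * 2], label='op1')
choice.choose(1)      # pick the second candidate
print(choice(10))     # 20
```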

@QuanluZhang QuanluZhang marked this pull request as ready for review December 5, 2020 14:30
@QuanluZhang changed the title from "[Retiarii] support LayerChoice and InputChoice" to "[Retiarii] end2end" on Dec 5, 2020
```python
elif node.kind() == 'prim::GetAttr':
    pass
elif node.kind() == 'prim::Loop':
    print('mygraph: ', sm_graph)
```
Contributor:

Remove the print message.

```python
    pass
elif node.kind() == 'prim::Loop':
    print('mygraph: ', sm_graph)
    raise RuntimeError('Loop has not been supported yet!')
```
Contributor:

What happens to other prim:: ops?
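The question above points at a common pattern for this kind of node-kind dispatch: handle the kinds you know, and fail loudly on any other `prim::` op instead of silently skipping it. A minimal sketch, with a fake node class standing in for TorchScript's `Node` (the handler names and return values are illustrative, not the PR's actual code):

```python
def handle_node(node):
    """Dispatch on TorchScript node kind with an explicit fallback."""
    kind = node.kind()
    if kind == 'prim::Constant':
        return 'constant'
    elif kind == 'prim::GetAttr':
        return 'getattr'
    elif kind == 'prim::Loop':
        raise RuntimeError('Loop has not been supported yet!')
    else:
        # Fail fast on any prim:: op not explicitly handled above.
        raise RuntimeError(f'unsupported node kind: {kind}')

class FakeNode:
    """Minimal stand-in exposing the kind() accessor used above."""
    def __init__(self, kind):
        self._kind = kind
    def kind(self):
        return self._kind

print(handle_node(FakeNode('prim::GetAttr')))  # getattr
```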

```python
        self.tpe_sampler = TPESampler()
        self.model_id = 0

    def run(self, base_model, applied_mutators, trainer):
```
Contributor:

It looks strange that the strategy needs an extra trainer. Why don't we merge the trainer into the base model?
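The two API shapes being debated can be sketched side by side. This is an illustration of the design question only; the class and function names below are hypothetical, and toy callables stand in for real models and trainers:

```python
class ModelWithTrainer:
    """Shape B: the trainer travels with the model."""
    def __init__(self, model, trainer):
        self.model = model
        self.trainer = trainer

def run_shape_a(base_model, applied_mutators, trainer):
    # Shape A: strategy receives model, mutators, and trainer separately.
    return trainer(base_model)

def run_shape_b(packed):
    # Shape B: strategy needs only one object; run()'s surface shrinks.
    return packed.trainer(packed.model)

double = lambda m: m * 2
print(run_shape_a(3, [], double))                # 6
print(run_shape_b(ModelWithTrainer(3, double)))  # 6
```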

```
@@ -1,18 +1,31 @@
MODULE_EXCEPT_LIST = ['Sequential']
RETIARII_BASE_OPS = ['Placeholder']


class Type:
```
Contributor @liuzhe-lz commented Dec 9, 2020:

Why use a class here? What's the behavior of Type()?

Contributor (author):

Will change the class name and split this class into two classes.

Contributor (author):

Let me refactor this part in the next PR.


```python
class LayerChoice(nn.Module):
    def __init__(self, candidate_ops: List, label: str = None):
        super(LayerChoice, self).__init__()
```
Contributor:

This is Python 2's style. super().__init__() is enough.
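The reviewer's point, demonstrated: in Python 3 the zero-argument form of `super()` resolves the class and instance automatically, so the explicit `super(LayerChoice, self)` spelling is only needed for Python 2 compatibility. A minimal self-contained comparison:

```python
class Base:
    def __init__(self):
        self.initialized = True

class OldStyle(Base):
    def __init__(self):
        # Python-2-compatible spelling: class and instance passed explicitly.
        super(OldStyle, self).__init__()

class NewStyle(Base):
    def __init__(self):
        # Equivalent Python-3 form: arguments are filled in automatically.
        super().__init__()

print(OldStyle().initialized, NewStyle().initialized)  # True True
```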

```
@@ -80,3 +94,7 @@ def __init__(self, *args, **kws):
ReLU = wrap_module(nn.ReLU)
Dropout = wrap_module(nn.Dropout)
Linear = wrap_module(nn.Linear)
MaxPool2d = wrap_module(nn.MaxPool2d)
```
Contributor:

The path of this file should contain "pytorch".

```python
class BaseStrategy(abc.ABC):

    @abc.abstractmethod
    def run(self, base_model: 'Model', applied_mutators: List['Mutator'], trainer: 'BaseTrainer') -> None:
```
Contributor:

If it only has one method, I don't think there needs to be a class.
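The reviewer's alternative can be sketched: a one-method ABC can often be replaced by a plain callable plus a type alias, with no class hierarchy at all. The names below (`Strategy`, `random_strategy`) are hypothetical, and `Any` stands in for the real `Model`/`Mutator`/`BaseTrainer` types:

```python
import abc
from typing import Any, Callable, List

# Instead of a single-method abstract base class...
class BaseStrategy(abc.ABC):
    @abc.abstractmethod
    def run(self, base_model: Any, applied_mutators: List[Any], trainer: Any) -> None:
        ...

# ...a strategy could simply be any callable with this signature:
Strategy = Callable[[Any, List[Any], Any], None]

def random_strategy(base_model, applied_mutators, trainer):
    # ... sample a model, apply mutators, hand it to the trainer ...
    pass

strategy: Strategy = random_strategy  # no subclassing required
strategy(None, [], None)
```

The class form does leave room to grow (extra shared methods, state between calls), which may be why the PR keeps it.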

```
@@ -17,6 +19,23 @@ class BaseTrainer(abc.ABC):
    Trainer has a ``fit`` function with no return value. Intermediate results and final results should be
    directly sent via ``nni.report_intermediate_result()`` and ``nni.report_final_result()`` functions.
    """
    def __init__(self, *args, **kwargs):
        module = self.__class__.__module__
        if module is None or module == str.__class__.__module__:
```
Contributor @liuzhe-lz commented Dec 9, 2020:

Isn't str.__class__.__module__ built-in?
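Checking the reviewer's point: `str.__class__` is `type`, and `type` is defined in the `builtins` module, so the comparison is effectively `module == 'builtins'`. The idiom works, but spelling out `'builtins'` would be clearer than routing through `str`:

```python
# str is a built-in class, so its metaclass is type...
print(str.__class__)             # <class 'type'>

# ...and type itself lives in the builtins module.
print(str.__class__.__module__)  # builtins

# So the check in the diff is equivalent to comparing against 'builtins'.
print(str.__class__.__module__ == 'builtins')  # True
```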

```python
remove_unconnected_nodes(ir_graph, targeted_type='prim::GetAttr')
merge_aten_slices(ir_graph)

def _handle_layerchoice(module):
```
Contributor:

So you finally decided not to make LayerChoice a placeholder?

Contributor (author):

Let's refactor this part in the future.

@ultmaster (Contributor) commented:

Suggestions:

  • Rename model_apis to nn.pytorch, like dgl did.
  • Decorator might be a better idea than tca. An example usage from mmcv.

@QuanluZhang (author) replied, quoting the suggestions above:

Good suggestions. I have finished the first one, and will leave the second one to the next PR.
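The deferred decorator suggestion can be sketched with an mmcv-style registry, where a decorator registers each module class by name instead of wrapping every module with a call like `wrap_module(...)`. The `Registry`/`OPS` names below are illustrative, not NNI's or mmcv's actual API:

```python
class Registry:
    """Minimal mmcv-style registry: classes register themselves via a decorator."""

    def __init__(self, name):
        self.name = name
        self._modules = {}

    def register(self, cls):
        # Used as a decorator: records the class under its own name.
        self._modules[cls.__name__] = cls
        return cls

    def get(self, name):
        return self._modules[name]

OPS = Registry('ops')

@OPS.register
class MaxPool2d:
    """Toy stand-in for a wrapped nn module."""
    pass

print(OPS.get('MaxPool2d') is MaxPool2d)  # True
```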

@QuanluZhang QuanluZhang merged commit d165905 into microsoft:dev-retiarii Dec 11, 2020
4 participants