Conversation
suiguoxin commented Jul 10, 2020 (edited)
- Reuse class docstrings in "Read the Docs", as is done for NAS
- Remove the redundant parameter `optimizer` from pruners (`level`, `slim`, `fpgm`, `l1`, `l2`) and change the related unit tests accordingly
- Fix some typos
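The second bullet changes the pruner constructor signature. A minimal sketch of the before/after, using hypothetical stand-in classes rather than the actual NNI implementations:

```python
# Hypothetical stand-ins illustrating the signature change described above;
# the real pruners live in nni.compression.torch and do actual pruning.
class LevelPrunerBefore:
    def __init__(self, model, config_list, optimizer):
        self.model = model
        self.config_list = config_list
        self.optimizer = optimizer  # redundant: never used by the pruning logic

class LevelPrunerAfter:
    def __init__(self, model, config_list):  # redundant parameter dropped
        self.model = model
        self.config_list = config_list

# Callers now construct these pruners without passing an optimizer:
config_list = [{"sparsity": 0.5, "op_types": ["default"]}]
pruner = LevelPrunerAfter(model=None, config_list=config_list)
```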
bc039ef to 3f0eb81
docs/en_US/Compressor/Pruner.md
Outdated
- **sparsity:** How much percentage of convolutional filters are to be pruned.
- **op_types:** Currently only Conv2d is supported in TaylorFOWeightFilterPruner.

#### User configuration for TaylorFOWeightFilterPruner
TaylorFOWeightFilterPruner -> TaylorFOWeightFilter Pruner
fixed
docs/en_US/Compressor/Pruner.md
Outdated
@@ -223,16 +246,18 @@ pruner.compress()

 Note: ActivationAPoZRankFilterPruner is used to prune convolutional layers within deep neural networks, therefore the `op_types` field supports only convolutional layers.

-You can view example for more information
+You can view [example](https://github.com/microsoft/nni/blob/master/examples/model_compress/model_prune_torch.py) for more information.

 ### User configuration for ActivationAPoZRankFilterPruner
ActivationAPoZRankFilterPruner -> ActivationAPoZRankFilter Pruner
fixed
docs/en_US/Compressor/Pruner.md
Outdated
##### PyTorch

```eval_rst
.. autoclass:: nni.compression.torch.AGP_Pruner
```
Could you help change all `AGP_Pruner` to `AGPPruner`? This pruner's name is different from all the others...
fixed
docs/en_US/Compressor/Pruner.md
Outdated
##### PyTorch

```eval_rst
.. autoclass:: nni.compression.torch.ADMMPruner
```
This is not `ADMMPruner`.
fixed
optimizer: torch.optim.Optimizer
    Optimizer used to train model.
pruning_algorithm: str
    Algorithms being used to prune model.
better to list all the options here
fixed
Supported keys:
    - sparsity : This is to specify the sparsity operations to be compressed to.
    - op_types : Only BatchNorm2d is supported in Slim Pruner.
"""
This docstring is put above `__init__`. @chicm-ms @liuzhe-lz, do you think this is a good practice?
This is to keep consistent with NAS classes and generate correct docstring in readthedocs.
I think this is a good practice since `__init__` is the constructor of the class. I checked PyTorch and TensorFlow; they put the docstring of the class before `__init__`, including the parameters of `__init__`. I was not aware of this before. For example: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/keras/engine/base_layer.py#L114
got it, thanks.
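The convention being discussed can be sketched as follows. This is a minimal illustrative class, not the actual `SlimPruner` implementation; the docstring with the `Parameters` section sits on the class, not on `__init__`:

```python
class SlimPruner:
    """Prune channels whose scaling factors are small.

    The class docstring documents the constructor parameters, following the
    convention used by PyTorch, TensorFlow/Keras, and the NAS classes here.
    Sphinx `autoclass` then renders the parameter table on Read the Docs
    without needing a separate __init__ docstring.

    Parameters
    ----------
    model : torch.nn.Module
        Model to be pruned.
    config_list : list of dict
        Supported keys: sparsity, op_types.
    """

    def __init__(self, model, config_list):  # no docstring needed here
        self.model = model
        self.config_list = config_list
```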
""" | ||
Parameters | ||
---------- | ||
model : torch.nn.module |
`torch.nn.module` should be `torch.nn.Module`.
    Base pruning algorithm. `level`, `l1` or `l2`, by default `l1`. Given the sparsity distribution among the ops,
    the assigned `base_algo` is used to decide which filters/channels/weights to prune.
start_temperature : float
    Simulated Annealing related parameter.
Suggest to provide more description for `start_temperature`, `stop_temperature`, and `cool_down_rate`.
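How these three parameters interact can be shown with a generic multiplicative cooling schedule. This is a plain-Python sketch of simulated-annealing cooling, not the actual NNI `SimulatedAnnealingPruner` logic; only the parameter names come from the docstring under review:

```python
# Generic simulated-annealing cooling schedule: the temperature starts at
# start_temperature and is multiplied by cool_down_rate each round until
# it drops to stop_temperature, at which point the search stops.
def cooling_schedule(start_temperature=100.0, stop_temperature=20.0,
                     cool_down_rate=0.9):
    temperatures = []
    t = start_temperature
    while t > stop_temperature:
        temperatures.append(t)
        t *= cool_down_rate  # multiplicative decay each round
    return temperatures

# A higher start_temperature or a cool_down_rate closer to 1.0 yields
# more rounds, i.e. a longer, more thorough search.
schedule = cooling_schedule()
```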
    - sparsity : This is to specify the sparsity operations to be compressed to.
    - op_types : Operation types to prune.
"""
def __init__(self, model, config_list):
`optimizer=None` should not be removed. The example does not work without this parameter.