This repository has been archived by the owner on Sep 18, 2024. It is now read-only.

iterative pruner follow-up #3669

Closed
J-shang opened this issue May 25, 2021 · 2 comments

Comments


J-shang commented May 25, 2021

Needs follow-up:

  • Check the compression docs and update outdated content.
  • TaylorFOWeightFilterPruner, ActivationAPoZRankFilterPruner, ActivationMeanRankFilterPruner: check whether these pruners need multi-epoch training. (Currently worked around per batch; needs refactoring.)
  • SlimPruner: check whether multi-iteration support is needed.
  • AutoCompressPruner, ADMMPruner, SimulatedAnnealingPruner: clarify the relationship between these pruners.
  • Test patch_optimizer_before() (callback); a sketch of the patching pattern follows this list.
  • Update unit tests.
  • Has statistics_batch_num disappeared?
  • LotteryTicketPruner: check whether the optimizer needs special handling.
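
For reference, a minimal sketch of the general "patch the optimizer, run a callback before each step" pattern that the patch_optimizer_before() item refers to. The helper name patch_optimizer_step and the toy model below are illustrative placeholders, not NNI's actual implementation.

```python
import torch

def patch_optimizer_step(optimizer, before_step_callback):
    """Wrap optimizer.step() so a callback runs right before every step.

    Standalone sketch of the patching pattern; NOT NNI's patch_optimizer_before().
    """
    original_step = optimizer.step

    def patched_step(*args, **kwargs):
        before_step_callback()               # e.g. re-apply pruning masks
        return original_step(*args, **kwargs)

    optimizer.step = patched_step
    return optimizer

# Toy usage: in a pruner, the callback would typically re-apply masks
# before the parameter update.
model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
patch_optimizer_step(optimizer, lambda: print("callback before optimizer.step()"))

x, y = torch.randn(8, 4), torch.randn(8, 2)
loss = torch.nn.functional.mse_loss(model(x), y)
loss.backward()
optimizer.step()  # prints the callback message, then performs the update
```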

J-shang commented Jun 4, 2021

  • Refactor compress() in IterativePruner to unify support for AGP and ADMM; a sketch of such a loop follows this list.
  • Some pruners have pruning conditions other than sparsity; do we need to support them?
  • Slim, TaylorFOWeightFilterPruner, ActivationAPoZRankFilterPruner, ActivationMeanRankFilterPruner need real iterative pruning (related to the item above).
  • Add full tests (L2Filter, Taylor, NetAdapt, SimulatedAnnealing, AutoCompress, AMC, Sensitivity, ADMM, Lottery Ticket).
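
A minimal sketch of what a unified iterative compress() loop could look like under the AGP schedule s_t = s_f + (s_i - s_f)(1 - t/n)^3 from Zhu & Gupta (2017). The names agp_sparsity, iterative_compress, prune_step, and finetune_step are hypothetical placeholders, not NNI's API.

```python
def agp_sparsity(initial, final, current_iteration, total_iterations):
    """AGP schedule (Zhu & Gupta, 2017): s_t = s_f + (s_i - s_f) * (1 - t / n) ** 3."""
    t = min(current_iteration, total_iterations)
    return final + (initial - final) * (1.0 - t / total_iterations) ** 3


def iterative_compress(model, prune_step, finetune_step, total_iterations,
                       initial_sparsity=0.0, final_sparsity=0.9):
    """Skeleton of a unified iterative compress() loop: each iteration computes
    a target sparsity, prunes, then fine-tunes.  prune_step(model, sparsity)
    and finetune_step(model) are supplied by the caller.
    """
    for t in range(1, total_iterations + 1):
        target = agp_sparsity(initial_sparsity, final_sparsity, t, total_iterations)
        prune_step(model, target)
        finetune_step(model)
    return model


# Control-flow demo with stub callables instead of a real model/trainer.
iterative_compress(
    model=None,
    prune_step=lambda m, s: print(f"prune to sparsity {s:.3f}"),
    finetune_step=lambda m: print("fine-tune one round"),
    total_iterations=5,
)
```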


J-shang commented Jun 4, 2021

  • Iterative pruning can be driven in two ways: apply a per-iteration sparsity rate to each layer and prune, or keep pruning until some other condition is met. A sketch of both modes follows.
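
A small sketch of the two drivers, assuming hypothetical placeholders run_iterations, prune_once, and stop_condition (not NNI's API):

```python
def run_iterations(prune_once, sparsity_schedule=None, stop_condition=None,
                   max_iterations=10):
    """Two ways to drive iterative pruning, matching the note above:
    - sparsity_schedule: per-iteration target sparsities to apply and prune, or
    - stop_condition: a predicate (e.g. an accuracy-drop check) evaluated after
      each pruning step; the loop stops once it returns True.
    """
    if sparsity_schedule is not None:
        for target in sparsity_schedule:     # apply sparsity rate & prune
            prune_once(target)
    else:
        for _ in range(max_iterations):      # prune by some other condition
            prune_once(None)
            if stop_condition():
                break


# Schedule-driven vs. condition-driven runs, with stub callables.
run_iterations(lambda s: print(f"prune to {s}"), sparsity_schedule=[0.3, 0.5, 0.7])
run_iterations(lambda s: print("prune a bit more"),
               stop_condition=lambda: True)  # stops after the first iteration
```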
