
[Feature]Modification of the finetune mechanism #177

Merged
merged 17 commits into alibaba:master on Jul 11, 2022

Conversation

DavdGao
Collaborator

@DavdGao DavdGao commented Jun 21, 2022

This PR is mainly for the modification of the finetune mechanism (#148), but we also make small changes to other functions as follows:

Finetune

  • Move some parameters from cfg.federate into cfg.train, since they are more relevant to training, including
    • local_update_step
    • batch_or_epoch
  • Create cfg.finetune and cfg.train in the config to support different hyperparameters for finetuning and training (e.g. optimizer.lr); see the sketch after this list
  • Implement a finetune function in the basic trainer
  • Modify most existing shell scripts and yaml files to fit the new setting (except the files under the directory benchmark)
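
Below is a minimal sketch of how the new cfg.train / cfg.finetune layout could look, written with a yacs-style CfgNode. The field names and default values here are illustrative assumptions, not the actual FederatedScope defaults.

```python
# Illustrative sketch only: the concrete keys and defaults in FederatedScope may differ.
from yacs.config import CfgNode as CN

cfg = CN()

# Training-related parameters moved from cfg.federate into cfg.train
cfg.train = CN()
cfg.train.local_update_steps = 1       # formerly under cfg.federate
cfg.train.batch_or_epoch = 'batch'     # formerly under cfg.federate
cfg.train.optimizer = CN()
cfg.train.optimizer.lr = 0.01

# A parallel cfg.finetune group so finetuning can use its own hyperparameters
cfg.finetune = CN()
cfg.finetune.local_update_steps = 1
cfg.finetune.batch_or_epoch = 'epoch'
cfg.finetune.optimizer = CN()
cfg.finetune.optimizer.lr = 0.001      # e.g. a smaller lr used only for finetuning
```

With two separate groups, a trainer can read cfg.train.optimizer.lr during regular local updates and cfg.finetune.optimizer.lr inside the finetune routine without the two settings interfering with each other.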

Enums and Decorators

  • Create enums.py to avoid hard-coded strings and the inconsistency issues they cause (see the sketch after this list)
  • Create decorators.py to keep the code clean
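
As a rough illustration of the enums.py idea (the member names are assumptions, and the actual module may use plain string constants rather than enum.Enum):

```python
# Hypothetical sketch of enums.py: named constants instead of scattered raw strings.
from enum import Enum

class MODE(Enum):
    TRAIN = 'train'
    FINETUNE = 'finetune'
    VAL = 'val'
    TEST = 'test'

# Comparing against MODE.FINETUNE instead of the literal 'finetune' makes typos
# fail loudly (AttributeError) instead of silently taking the wrong branch.
```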

Optimizer

To be discussed

@joneswong please check if the following modifications are appropriate

  • In this PR, use_diff is implemented as a decorator named use_diff (a sketch follows this list).
  • Some HPO configs are modified to fit the new configuration.
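
For reference, here is a hand-written sketch of what such a use_diff decorator could look like. The config flag, the get_model_para helper, and the (sample_size, model_para, results) return signature are assumptions about the trainer interface, not the exact code of this PR.

```python
import copy
import functools

def use_diff(func):
    """Wrap a training routine so that, when the config asks for it, the routine
    returns weight deltas (trained minus initial) instead of the full weights."""
    @functools.wraps(func)
    def wrapper(self, *args, **kwargs):
        return_diff = self.cfg.federate.use_diff               # assumed config flag
        if return_diff:
            init_para = copy.deepcopy(self.get_model_para())   # assumed helper
        sample_size, model_para, results = func(self, *args, **kwargs)
        if return_diff:
            model_para = {
                name: model_para[name] - init_para[name]
                for name in model_para
            }
        return sample_size, model_para, results
    return wrapper
```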

…ing string; adjust config; adjust trainer and context; create optimizer in the beginning of a routine
# Conflicts:
#	federatedscope/core/trainers/trainer.py
@@ -0,0 +1,20 @@
def use_diff(func):
Collaborator

@rayrayraykk Please help me check this modification. Make sure it is consistent with my original implementation. Thanks!

Collaborator

This part looks good to me.

@joneswong joneswong added the "enhancement" (New feature or request) label on Jul 5, 2022
@DavdGao DavdGao changed the title from "Modification of the finetune mechanism" to "[Feature]Modification of the finetune mechanism" on Jul 6, 2022
@DavdGao DavdGao requested a review from Osier-Yi July 7, 2022 02:53
Collaborator

@xieyxclack xieyxclack left a comment

Good job! It might help us to split the functionalities of the trainer, and @yxdyc can double-check the details of the trainer.
Please note that the modification of the config (such as cfg.optimizer -> cfg.train.optimizer) would affect a large number of existing files, and we should make sure the provided examples and declarations have been updated.

- g['lr'] = cfg.personalization.lr
- ctx.pFedMe_outer_lr = cfg.optimizer.lr
+ g['lr'] = ctx.cfg.personalization.lr
+ ctx.pFedMe_outer_lr = ctx.cfg.train.optimizer.lr
Collaborator

Shall we have ctx.cfg.xxx ? @yxdyc

Collaborator

Both are OK, as we don't usually change cfg.personalization.lr during the FL course.

@yxdyc
Collaborator

yxdyc commented Jul 7, 2022

LGTM. The modification of the configs can invalidate the scripts of our three FL benchmarks. We can discuss later whether we need to make the three benchmarks compatible with FS 0.2.0.

DavdGao added 3 commits July 8, 2022 10:26
# Conflicts:
#	federatedscope/core/auxiliaries/utils.py
#	federatedscope/core/configs/cfg_fl_algo.py
#	federatedscope/core/configs/cfg_fl_setting.py
#	federatedscope/core/configs/cfg_hpo.py
#	federatedscope/core/configs/cfg_training.py
#	federatedscope/core/trainers/context.py
#	federatedscope/core/trainers/torch_trainer.py
#	federatedscope/core/trainers/trainer.py
#	federatedscope/core/worker/client.py
@rayrayraykk
Collaborator

LGTM, but please rebase onto master to trigger a new CI test for the format checks, thanks.

@DavdGao
Collaborator Author

DavdGao commented Jul 11, 2022

> LGTM, but please rebase onto master to trigger a new CI test for the format checks, thanks.

Modified accordingly.

@xieyxclack xieyxclack self-requested a review July 11, 2022 03:27
Collaborator

@xieyxclack xieyxclack left a comment

LGTM

@xieyxclack xieyxclack merged commit 8acc7b6 into alibaba:master Jul 11, 2022
Labels
enhancement New feature or request
5 participants