Deprecates AdamW and adds --optim #14744

Merged Jan 13, 2022. 55 commits from the deprecate_adamw branch were merged into master; the diff shown below the commit list is from a single commit of that branch, b2675f8.
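This PR deprecates the bundled AdamW implementation in favor of torch.optim.AdamW and replaces the standalone --adafactor flag with a general --optim selector on TrainingArguments. A minimal sketch of the new interface (value strings taken from the commit messages below; output_dir is an arbitrary placeholder):

    from transformers import TrainingArguments

    # The new `optim` argument selects the optimizer implementation by name.
    # Per this PR, "adamw_hf" is the default, with "adamw_torch",
    # "adamw_apex_fused", and "adafactor" as the alternatives.
    args = TrainingArguments(
        output_dir="out",
        optim="adafactor",  # replaces the deprecated `--adafactor` flag
    )

On the command line, the equivalent is --optim adafactor instead of the deprecated --adafactor.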
Commits (55)
32392af
Add AdamW deprecation warning
manuelciosici Dec 10, 2021
c637f37
Add --optim to Trainer
manuelciosici Dec 10, 2021
b4d0b6d
Update src/transformers/optimization.py
manuelciosici Dec 13, 2021
bcc2408
Update src/transformers/optimization.py
manuelciosici Dec 13, 2021
460eff4
Update src/transformers/optimization.py
manuelciosici Dec 13, 2021
6dc78a6
Update src/transformers/optimization.py
manuelciosici Dec 13, 2021
68dd581
Update src/transformers/training_args.py
manuelciosici Dec 13, 2021
9560350
Update src/transformers/training_args.py
manuelciosici Dec 13, 2021
01f1c7b
Update src/transformers/training_args.py
stas00 Dec 27, 2021
0c79a5f
Merge remote-tracking branch 'origin/master' into deprecate_adamw
stas00 Dec 27, 2021
7ec094f
fix style
stas00 Dec 27, 2021
1c9cccf
fix
stas00 Dec 29, 2021
9807d35
Regroup adamws together
manuelciosici Dec 30, 2021
7a063ab
Change --adafactor to --optim adafactor
manuelciosici Dec 30, 2021
d599a38
Use Enum for optimizer values
manuelciosici Dec 30, 2021
1f9210c
fixup! Change --adafactor to --optim adafactor
manuelciosici Dec 30, 2021
a80b39e
fixup! Change --adafactor to --optim adafactor
manuelciosici Dec 30, 2021
fdf40b2
fixup! Change --adafactor to --optim adafactor
manuelciosici Dec 30, 2021
d5dc69a
Merge branch 'master' into deprecate_adamw
manuelciosici Dec 30, 2021
0acba0c
fixup! Use Enum for optimizer values
manuelciosici Dec 30, 2021
2b7d9dd
Improved documentation for --adafactor
manuelciosici Dec 31, 2021
7c3139a
Add mention of no_deprecation_warning
manuelciosici Dec 31, 2021
234f7d1
Rename OptimizerOptions to OptimizerNames
manuelciosici Dec 31, 2021
1786d42
Use choices for --optim
manuelciosici Dec 31, 2021
210ed37
Move optimizer selection code to a function and add a unit test
manuelciosici Dec 31, 2021
7e62da9
Change optimizer names
manuelciosici Dec 31, 2021
0e7f955
Rename method
manuelciosici Jan 1, 2022
12a9e37
Rename method
manuelciosici Jan 1, 2022
c5853b0
Remove TODO comment
manuelciosici Jan 1, 2022
d59aa52
Rename variable
manuelciosici Jan 1, 2022
e7ffd71
Rename variable
manuelciosici Jan 1, 2022
b64fc03
Rename function
manuelciosici Jan 1, 2022
c5b5443
Rename variable
manuelciosici Jan 1, 2022
91aff78
Parameterize the tests for supported optimizers
manuelciosici Jan 1, 2022
f3505db
Refactor
manuelciosici Jan 1, 2022
91c35f2
Attempt to make tests pass on CircleCI
manuelciosici Jan 1, 2022
bcd8a0d
Add a test with apex
manuelciosici Jan 2, 2022
f8cb39c
rework to add apex to parameterized; add actual train test
stas00 Jan 2, 2022
98f0f2f
fix import when torch is not available
stas00 Jan 2, 2022
eba41bd
fix optim_test_params when torch is not available
stas00 Jan 2, 2022
aaee305
fix optim_test_params when torch is not available
stas00 Jan 2, 2022
071198c
re-org
stas00 Jan 2, 2022
182dac8
small re-org
stas00 Jan 2, 2022
2b46361
fix test_fused_adam_no_apex
stas00 Jan 2, 2022
470a1d7
Update src/transformers/training_args.py
manuelciosici Jan 12, 2022
cb85474
Update src/transformers/training_args.py
manuelciosici Jan 12, 2022
b2675f8
Update src/transformers/training_args.py
manuelciosici Jan 12, 2022
1e8acec
Remove .value from OptimizerNames
manuelciosici Jan 12, 2022
b32a194
Rename optimizer strings s|--adam_|--adamw_|
manuelciosici Jan 12, 2022
b839e80
Also rename Enum options
manuelciosici Jan 12, 2022
e73249c
small fix
stas00 Jan 12, 2022
7ac8dc0
Fix instantiation of OptimizerNames. Remove redundant test
manuelciosici Jan 12, 2022
a2363cd
Use ExplicitEnum instead of Enum
manuelciosici Jan 12, 2022
ea02877
Add unit test with string optimizer
manuelciosici Jan 12, 2022
ec92011
Change optimizer default to string value
manuelciosici Jan 13, 2022
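Two of the commits above, "Add AdamW deprecation warning" and "Add mention of no_deprecation_warning", establish the deprecation mechanics: AdamW keeps working but emits a FutureWarning unless the caller opts out. A sketch of that pattern, with the helper name invented here for illustration:

    import warnings

    def warn_adamw_deprecated(no_deprecation_warning: bool = False) -> None:
        # Illustrative only: in the PR this check lives in AdamW.__init__,
        # and the opt-out flag lets internal callers such as the Trainer
        # silence the warning.
        if not no_deprecation_warning:
            warnings.warn(
                "This implementation of AdamW is deprecated and will be removed "
                "in a future version. Use torch.optim.AdamW instead, or set "
                "`no_deprecation_warning=True` to disable this warning",
                FutureWarning,
            )

    warn_adamw_deprecated()  # emits the FutureWarning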
Viewed commit: b2675f828298436af8f8394fb35a4f774b99f4b6
Update src/transformers/training_args.py
Authored by manuelciosici and sgugger, Jan 12, 2022
Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
src/transformers/training_args.py (2 changes: 1 addition & 1 deletion)

@@ -832,7 +832,7 @@ def __post_init__(self):
                 "`--adafactor` is deprecated and will be removed in version 5 of 🤗 Transformers. Use `--optim adafactor` instead",
                 FutureWarning,
             )
-            self.optim = OptimizerNames.ADAFACTOR.value
+            self.optim = OptimizerNames.ADAFACTOR
 
         if (
             is_torch_available()
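The commit drops .value because OptimizerNames is an ExplicitEnum (per the "Use ExplicitEnum instead of Enum" commit above), and ExplicitEnum subclasses str, so a member compares equal to its own string value. A condensed sketch of why the assignment stays compatible with string comparisons downstream (member list and error message abbreviated relative to the real source):

    from enum import Enum

    class ExplicitEnum(str, Enum):
        # Sketch of transformers' str-backed enum with a clearer error
        # for unknown values.
        @classmethod
        def _missing_(cls, value):
            raise ValueError(
                f"{value} is not a valid {cls.__name__}, please select one of "
                f"{list(cls._value2member_map_.keys())}"
            )

    class OptimizerNames(ExplicitEnum):
        ADAMW_HF = "adamw_hf"
        ADAMW_TORCH = "adamw_torch"
        ADAMW_APEX_FUSED = "adamw_apex_fused"
        ADAFACTOR = "adafactor"

    # Members are str subclasses, so assigning the member directly (as this
    # commit does) still satisfies equality checks against plain strings:
    assert OptimizerNames.ADAFACTOR == "adafactor"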