
Adding gradient checkpointing to GPT2 #7446

Merged (6 commits) on Sep 29, 2020

Conversation

@TevenLeScao (Contributor) commented on Sep 29, 2020:

This PR adds gradient checkpointing support to GPT-2, mirroring the existing Longformer and Bert checkpointing code. It also disables find_unused_parameters in Trainer when the model uses gradient checkpointing, since the two are incompatible (see #4659).
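
For context, the heart of the change is the standard activation-checkpointing pattern: route each transformer block's forward through torch.utils.checkpoint so its activations are recomputed during the backward pass instead of being stored. The self-contained sketch below illustrates the idea on a toy block stack; `Block` and `ToyModel` are hypothetical stand-ins, not code from this PR.

```python
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint


class Block(nn.Module):
    """Toy stand-in for a GPT-2 transformer block (hypothetical)."""

    def __init__(self, dim):
        super().__init__()
        self.ff = nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))

    def forward(self, hidden_states):
        return hidden_states + self.ff(hidden_states)


class ToyModel(nn.Module):
    def __init__(self, dim=64, n_layer=4, gradient_checkpointing=False):
        super().__init__()
        self.gradient_checkpointing = gradient_checkpointing
        self.h = nn.ModuleList([Block(dim) for _ in range(n_layer)])

    def forward(self, hidden_states):
        for block in self.h:
            if self.gradient_checkpointing and self.training:
                # Don't keep this block's activations; recompute them during backward.
                hidden_states = checkpoint(block, hidden_states)
            else:
                hidden_states = block(hidden_states)
        return hidden_states


model = ToyModel(gradient_checkpointing=True).train()
x = torch.randn(8, 128, 64, requires_grad=True)
model(x).sum().backward()  # gradients are still correct; peak activation memory is lower
```

In the PR itself this wrapping sits inside the GPT-2 forward loop behind a config flag (see the modeling_gpt2.py excerpt further down), and the Trainer additionally turns off find_unused_parameters for distributed training when the flag is set, because unused-parameter detection conflicts with checkpointed recomputation (#4659).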

codecov bot commented on Sep 29, 2020:

Codecov Report

Merging #7446 into master will decrease coverage by 1.92%.
The diff coverage is 100.00%.


@@            Coverage Diff             @@
##           master    #7446      +/-   ##
==========================================
- Coverage   80.98%   79.06%   -1.93%     
==========================================
  Files         181      181              
  Lines       35750    35757       +7     
==========================================
- Hits        28953    28271     -682     
- Misses       6797     7486     +689     
| Impacted Files | Coverage Δ |
|---|---|
| src/transformers/trainer.py | 55.70% <ø> (ø) |
| src/transformers/configuration_gpt2.py | 97.36% <100.00%> (+0.07%) ⬆️ |
| src/transformers/modeling_gpt2.py | 87.03% <100.00%> (+0.20%) ⬆️ |
| src/transformers/modeling_tf_lxmert.py | 22.14% <0.00%> (-72.41%) ⬇️ |
| src/transformers/modeling_rag.py | 25.39% <0.00%> (-51.59%) ⬇️ |
| src/transformers/modeling_tf_bert.py | 65.26% <0.00%> (-33.64%) ⬇️ |
| src/transformers/modeling_marian.py | 60.00% <0.00%> (-30.00%) ⬇️ |
| src/transformers/modeling_longformer.py | 74.14% <0.00%> (-18.70%) ⬇️ |
| src/transformers/activations.py | 79.16% <0.00%> (-4.17%) ⬇️ |
| src/transformers/tokenization_utils_base.py | 90.12% <0.00%> (-3.79%) ⬇️ |
| ... and 8 more | |

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

In src/transformers/modeling_gpt2.py:

use_cache=use_cache,
output_attentions=output_attentions,
)
if getattr(self.config, "gradient_checkpointing", False):
Contributor:

I think `if self.config.gradient_checkpointing:` is nicer

Contributor (PR author):

Most model configs don't actually have this attribute, only the ones that support checkpointing (AFAIK, Bert and Longformer for now), so it's less risky to do things this way.

Contributor:

This is in modeling_gpt2.py, which only works with configuration_gpt2.py. So if you add gradient_checkpointing to the config with a default of False, I don't see why this would be risky.
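
A minimal sketch of this suggestion, assuming the flag is declared with a default of False in configuration_gpt2.py (the constructor is heavily abridged; this is an illustration, not the merged change):

```python
from transformers import PretrainedConfig


class GPT2Config(PretrainedConfig):
    """Abridged sketch: only the argument relevant to this discussion is shown."""

    def __init__(self, gradient_checkpointing=False, **kwargs):
        super().__init__(**kwargs)
        # A default of False keeps configs created before this PR working.
        self.gradient_checkpointing = gradient_checkpointing


config = GPT2Config()

# With the default in place, both styles from the thread behave identically:
assert getattr(config, "gradient_checkpointing", False) is False  # defensive lookup
assert config.gradient_checkpointing is False                     # plain attribute access
```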

In tests/test_modeling_gpt2.py:

@@ -355,6 +365,10 @@ def test_gpt2_double_lm_head_model(self):
config_and_inputs = self.model_tester.prepare_config_and_inputs()
self.model_tester.create_and_check_double_lm_head_model(*config_and_inputs)

def test_gpt2_gradient_checkpointing(self):
Contributor:

Awesome that you added a test!

@patrickvonplaten self-requested a review on September 29, 2020 at 10:35.

@patrickvonplaten (Contributor) left a comment:

Looks good to me. Left mostly nits.

Would be great if you could run `RUN_SLOW=1 pytest tests/test_modeling_gpt2.py` once to be sure.

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
@TevenLeScao (Contributor, PR author) commented:

The slow tests are passing. I've also added a test for generation with checkpointing, although to be fully sure one should also check the contents of the backward pass.
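
For readers who want to reproduce the check locally, here is a rough, hypothetical sketch of such a test (tiny model sizes and the helper-free structure are chosen for brevity; the flag name matches this PR, everything else is illustrative, and it is not the test that was merged). As noted above, it exercises a backward pass and generation but does not verify the recomputed gradients numerically.

```python
import torch
from transformers import GPT2Config, GPT2LMHeadModel


def test_gpt2_gradient_checkpointing_smoke():
    # Tiny config so the test runs quickly on CPU.
    config = GPT2Config(
        vocab_size=99, n_positions=64, n_embd=32, n_layer=2, n_head=2,
        bos_token_id=0, eos_token_id=0,
    )
    config.gradient_checkpointing = True  # the flag introduced by this PR

    model = GPT2LMHeadModel(config)
    model.train()
    input_ids = torch.randint(0, config.vocab_size, (2, 8))

    # Forward + backward with checkpointing: every parameter should receive a gradient.
    loss = model(input_ids, labels=input_ids)[0]
    loss.backward()
    assert all(p.grad is not None for p in model.parameters() if p.requires_grad)

    # Greedy generation should still run with the flag set.
    model.eval()
    generated = model.generate(input_ids, max_length=16, do_sample=False)
    assert generated.shape[1] <= 16
```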

@sgugger (Collaborator) left a comment:

Very nice! Thanks a lot for adding this.

(An outdated review comment on src/transformers/configuration_gpt2.py was resolved.)
Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
@LysandreJik (Member) left a comment:

LGTM! Thanks for working on it @TevenLeScao!
