Improve greedy search memory usage #32895
Conversation
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
CI failed but it doesn't seem to be related to this PR
Wow, interesting finding! Thanks for handling
Thank you for the fix 🙏
(CI is failing for known reasons; will take care of rebasing and merging when the root cause is fixed cc @amyeroberts)
Thanks for adding this improvement!
Do not call torch.repeat_interleave if expand_size is 1
What does this PR do?
When doing greedy search, inputs go through `_expand_inputs_for_generation`, where they are expanded (see here). As the expand size is always 1 in the case of greedy search, `torch.repeat_interleave` does not modify the inputs. However, it does increase memory usage, because the input to `torch.repeat_interleave` is cloned. Here is a code snippet to check this behaviour:
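(A minimal sketch rather than the exact snippet from the discussion; tensor sizes are illustrative.)

```python
import torch

# repeat_interleave with repeats=1 leaves the values unchanged but still
# allocates a new tensor, i.e. the input is cloned.
x = torch.randn(4, 8)  # illustrative size
y = x.repeat_interleave(1, dim=0)

print(torch.equal(x, y))             # True  -> values are identical
print(x.data_ptr() == y.data_ptr())  # False -> the storage was copied (extra memory)
```

which returns `True` followed by `False`: the values are untouched, but a second copy of the tensor now lives in memory.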
Thus, if the expand size is 1, we can return the model inputs without calling `torch.repeat_interleave`. That's the change introduced in this PR (sketched below). More context in this Slack thread: https://huggingface.slack.com/archives/C01N44FJDHT/p1723827436938589
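As a rough sketch of that early return, assuming a simplified signature (the actual `_expand_inputs_for_generation` in `transformers` also handles encoder-decoder inputs and non-tensor kwargs):

```python
import torch

def _expand_inputs_for_generation_sketch(input_ids, expand_size=1, **model_kwargs):
    # Simplified sketch of the early-return idea, not the exact transformers code.
    # With expand_size == 1 (greedy search), repeat_interleave would clone the
    # tensors without changing their values, so skip it and return the inputs as-is.
    if expand_size == 1:
        return input_ids, model_kwargs

    input_ids = input_ids.repeat_interleave(expand_size, dim=0)
    model_kwargs = {
        key: value.repeat_interleave(expand_size, dim=0) if isinstance(value, torch.Tensor) else value
        for key, value in model_kwargs.items()
    }
    return input_ids, model_kwargs
```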
Before submitting
- Did you read the contributor guideline, Pull Request section?
- Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
- Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.