passing past_key_values as a tuple is deprecated, but unclear how to resolve #33489
Comments
hey! That's really not on your side, thanks for reporting!
Maybe we can start removing old cache support from a bunch of models like Llama and clean up for the v4.46 release. I didn't dig into when exactly the warning appears, but I guess it's the eval stage, and since the model isn't in `self.training` anymore the warning is raised. A workaround for @RonanKMcGovern can be setting `use_cache=False` after loading the model, because SFT doesn't really generate except for a small sample if needed. So it is `model.generation_config.use_cache=False`.
Thanks Raushan, yes, correct, by setting `use_cache=False` I am able to silence the warning.
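A minimal sketch of that workaround (the checkpoint name is an illustrative assumption, not from the thread):

```python
from transformers import AutoModelForCausalLM

# Illustrative checkpoint; substitute your own model.
model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")

# SFT doesn't really generate (beyond a small eval sample), so the KV cache
# can be disabled to silence the deprecation warning:
model.generation_config.use_cache = False
```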
The warning only pops up when a) […]. @zucchini-nlp do you have the bandwidth to update the trainer regarding this warning? 🤗 (since you'll be touching the trainer to allow generation to happen)
@gante actually we don't check for […] (see transformers/src/transformers/models/llama/modeling_llama.py, lines 948 to 950 in ac5a055). I think we can force […]
@zucchini-nlp doh 🤦 I'm opening a PR asap to only throw this warning in the presence of non-None `past_key_values`.
@gante isn't it BC breaking, as prev we init an empty tuple when […]? EDIT: my bad, warning only when non-None, hehe, gotcha
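For context, a hedged sketch of the kind of check being discussed; the actual lines 948 to 950 aren't quoted above, and the helper name below is hypothetical:

```python
from transformers.cache_utils import Cache, DynamicCache
from transformers.utils import logging

logger = logging.get_logger(__name__)

def _maybe_convert_legacy_cache(past_key_values, use_cache):
    # Hypothetical helper mirroring the discussed fix: convert legacy tuple
    # caches to a Cache object, but only warn when the user actually passed one.
    if use_cache and not isinstance(past_key_values, Cache):
        if past_key_values is not None:  # stay silent when nothing was passed
            logger.warning_once(
                "Passing `past_key_values` as a tuple is deprecated; "
                "please use an appropriate `Cache` class instead."
            )
        past_key_values = DynamicCache.from_legacy_cache(past_key_values)
    return past_key_values
```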
I am a simple user of the HF classes and I have absolutely no idea what this warning message that I'm receiving is about.
Yeah, sorry, it should be gone now in most cases.
System Info
`transformers` version: 4.44.2
Who can help?
@ArthurZucker
Information
Tasks
An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
Reproduction
Expected behavior
Getting this error: […]
I don't expect an error here, and it's unclear what I need to update if I'm to use an appropriate `Cache` class.
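For anyone landing here with the same question, a sketch of the migration the warning asks for, assuming transformers >= 4.44 (checkpoint and prompt are illustrative):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, DynamicCache

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")
model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")

inputs = tokenizer("Hello", return_tensors="pt")

# Instead of passing past_key_values as a tuple of (key, value) tensor pairs,
# pass a Cache object such as DynamicCache:
past_key_values = DynamicCache()
with torch.no_grad():
    outputs = model(**inputs, past_key_values=past_key_values, use_cache=True)

# An existing legacy tuple can be converted instead:
# past_key_values = DynamicCache.from_legacy_cache(legacy_tuple)
```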