PeftModel is not an instance of PreTrainedModel. No liger kernels will be applied.
#34016
Comments
cc @BenjaminBossan for the
I've narrowed the cause down to the changes made in #33502
Just to confirm, a
I don't know what exactly liger kernels do, but from the description they should not interfere with PEFT and should be usable in conjunction. LMK what you think.
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread. Please note that issues that do not follow the contributing guidelines are likely to be ignored.
Why has this been closed? It's not fixed and it's a very easy change. I've been having to manually do it this whole time on my local transformers version.
@ambroser53 The issue has been closed due to inactivity, but as you mentioned, it's still not resolved. Would you be interested in providing your fix in a PR?
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread. Please note that issues that do not follow the contributing guidelines are likely to be ignored.
System Info
transformers==4.45.1
peft==0.13.0
liger-kernel==0.3.1
So `isinstance(model.base_model.model, PreTrainedModel)` returns True but `isinstance(model, PreTrainedModel)` returns False, so no liger kernels are used.

Who can help?
No response
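The failing check can be reproduced in isolation. The classes below are minimal stand-ins (not the real `transformers`/`peft` types): a PEFT wrapper holds the pretrained model at `base_model.model` instead of inheriting from `PreTrainedModel`, so `isinstance` on the wrapper itself fails.

```python
# Dummy stand-ins illustrating the wrapping structure described in the issue.
class PreTrainedModel:            # stand-in for transformers.PreTrainedModel
    pass

class TunerWrapper:               # stand-in for peft's inner tuner (e.g. LoraModel)
    def __init__(self, model):
        self.model = model

class PeftModel:                  # stand-in for peft.PeftModel
    def __init__(self, model):
        self.base_model = TunerWrapper(model)

peft_model = PeftModel(PreTrainedModel())
print(isinstance(peft_model, PreTrainedModel))                   # False
print(isinstance(peft_model.base_model.model, PreTrainedModel))  # True
```

This mirrors the two `isinstance` results quoted above: the check the trainer performs sees only the outer wrapper.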
Information
Tasks
- examples folder (such as GLUE/SQuAD, ...)

Reproduction
Load any model with a `PeftModel` wrapper via `get_peft_model` and try to run with the `use_liger_kernel` flag in the trainer.

Expected behavior
Apply liger kernels. The fix should be as simple as adding a check for PEFT models that then checks `model.base_model.model`.
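A minimal sketch of that suggestion, using dummy stand-in classes for the real `transformers`/`peft` types (`unwrap_for_liger` is a hypothetical helper name, not an actual transformers function): unwrap the PEFT wrapper before the `isinstance` check.

```python
class PreTrainedModel:            # stand-in for transformers.PreTrainedModel
    pass

class _Tuner:                     # stand-in for peft's inner tuner wrapper
    def __init__(self, model):
        self.model = model

class PeftModel:                  # stand-in for peft.PeftModel
    def __init__(self, model):
        self.base_model = _Tuner(model)

def unwrap_for_liger(model):
    """Hypothetical helper: return the underlying PreTrainedModel,
    looking through a PEFT wrapper's base_model.model if present."""
    inner = getattr(getattr(model, "base_model", None), "model", None)
    if isinstance(inner, PreTrainedModel):
        return inner
    return model

# The wrapped model now passes the check that liger kernels gate on.
unwrapped = unwrap_for_liger(PeftModel(PreTrainedModel()))
print(isinstance(unwrapped, PreTrainedModel))   # True
```

Plain `PreTrainedModel` instances pass through unchanged, so the helper is safe to call unconditionally before applying the kernels.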