
AWQ: Patch for mismatched devices in RotaryEmbedding #1480

Merged
merged 1 commit into main from jambayk/device-map on Nov 12, 2024

Conversation

jambayk (Contributor) commented on Nov 12, 2024

Describe your changes

In transformers>4.43, which is required for models such as Llama-3.1 and Phi-3.5, there is an open issue related to a device mismatch in the rotary embedding module (huggingface/transformers#32420).
There is no fix yet in either transformers or autoawq, so we patch the model adapter in Olive based on the installed package versions. This unblocks quantization with newer transformers versions for Llama-like models. The fix is based on casper-hansen/AutoAWQ#630; a rough sketch of the approach is shown below.
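The sketch that follows is only an illustration of the idea, not the exact patch: the function name `patch_rotary_embedding` and its placement are hypothetical, and the real patch lives in Olive's model adapter and is applied conditionally on the installed transformers/autoawq versions, following the workaround in casper-hansen/AutoAWQ#630.

```python
# Hypothetical sketch: wrap the shared rotary embedding so its buffers follow
# the device of the incoming activations during AWQ quantization.
import functools

from packaging import version
import transformers


def patch_rotary_embedding(model):
    """Patch the model's shared rotary embedding for transformers>4.43 (sketch only)."""
    if version.parse(transformers.__version__) <= version.parse("4.43.0"):
        # Older versions compute rotary embeddings inside each decoder layer,
        # so the device mismatch does not occur and no patch is needed.
        return

    # In newer transformers, Llama-style models keep a single rotary embedding
    # module on the base model (e.g. model.model.rotary_emb).
    rotary_emb = getattr(model.model, "rotary_emb", None)
    if rotary_emb is None:
        return

    original_forward = rotary_emb.forward

    @functools.wraps(original_forward)
    def forward(x, position_ids):
        # AutoAWQ moves decoder layers between CPU and GPU while the shared
        # rotary embedding (and its inv_freq buffer) stays behind; move it to
        # the activation device before computing cos/sin.
        rotary_emb.to(x.device)
        return original_forward(x, position_ids.to(x.device))

    rotary_emb.forward = forward
```

In Olive the equivalent logic is applied from the model adapter so that users do not have to modify the model themselves; the version check keeps older transformers releases untouched.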

AutoGPTQ has the same issue, but it is already fixed in its main branch, so users can install from source. There is also no straightforward way to apply a similar patch there: adding model.rotary_embed to outside_layer_modules fails in get_device because the module has no parameters, as illustrated below.
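For context on that failure: the rotary embedding registers inv_freq as a buffer rather than a parameter, so any device lookup that iterates over the module's parameters finds nothing. A minimal, standalone illustration in plain PyTorch (not AutoGPTQ internals):

```python
import torch


class BufferOnly(torch.nn.Module):
    """Stand-in for the rotary embedding: inv_freq is a buffer, not a parameter."""

    def __init__(self):
        super().__init__()
        self.register_buffer("inv_freq", torch.ones(8))


module = BufferOnly()
try:
    # A parameter-based device lookup (as described for get_device above)
    # has nothing to iterate over for this module.
    device = next(module.parameters()).device
except StopIteration:
    print("module has no parameters; parameter-based device lookup fails")
```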

Checklist before requesting a review

  • Add unit tests for this change.
  • Make sure all tests can pass.
  • Update documents if necessary.
  • Lint and apply fixes to your code by running lintrunner -a
  • Is this a user-facing change? If yes, give a description of this change to be included in the release notes.
  • Is this PR including examples changes? If yes, please remember to update example documentation in a follow-up PR.

(Optional) Issue link

@jambayk jambayk merged commit 61876e2 into main Nov 12, 2024
25 checks passed
@jambayk jambayk deleted the jambayk/device-map branch November 12, 2024 07:00