
Fix FSDP for GPTQ-LoRA and Fused Ops #2

Open

wants to merge 4 commits into base: add-foak
Conversation

fabianlim (Owner)
As foundation-model-stack#15 mentions, there is a casting issue when using GPTQ-LoRA and fused ops.

The issue occurs with fused ops because they bypass the base layer's forward, into which we introduced a reinterpret_cast in the issue mentioned above:

  • To resolve this, we instead patch the fused ops functions directly.
  • We change patch_forward_to_view_attributes_before_call to allow patching multiple submodules. This is needed for fused ops because their forwards trigger on the attention module, not on the linear modules directly.
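The multi-submodule patching described above can be sketched as follows. This is a hypothetical illustration of the pattern, not the actual fms-acceleration implementation: the wrapper re-views the named attributes on each listed submodule (e.g. `q_proj`, `k_proj` under an attention module) before the fused forward runs. The dummy classes and the `tuple` "view" function stand in for real modules and a dtype reinterpretation such as `tensor.view(torch.int32)`.

```python
import functools

def patch_forward_to_view_attributes_before_call(
    forward, attribute_names, view_fn, submodule_names=None
):
    """Wrap `forward` so that, before it runs, each attribute in
    `attribute_names` is replaced by `view_fn(attr)` on every named
    submodule (or on the module itself if `submodule_names` is None)."""
    @functools.wraps(forward)
    def wrapped(module, *args, **kwargs):
        targets = (
            [module] if submodule_names is None
            else [getattr(module, name) for name in submodule_names]
        )
        for target in targets:
            for attr in attribute_names:
                # re-view the attribute in place before calling forward
                setattr(target, attr, view_fn(getattr(target, attr)))
        return forward(module, *args, **kwargs)
    return wrapped

# --- hypothetical usage: patching at the attention level ---
class Linear:
    def __init__(self):
        self.qweight = [1, 2]  # stands in for a packed int32 tensor

class Attention:
    def __init__(self):
        self.q_proj = Linear()
        self.k_proj = Linear()
    def forward(self):
        # a fused op would read the submodules' weights directly here
        return self.q_proj.qweight, self.k_proj.qweight

# patch the attention forward so both linear submodules are re-viewed
Attention.forward = patch_forward_to_view_attributes_before_call(
    Attention.forward, ["qweight"], tuple, ["q_proj", "k_proj"]
)

attn = Attention()
q, k = attn.forward()  # both qweights were "re-viewed" (list -> tuple)
```

Patching at the attention module rather than on each linear layer matters because, as noted above, the fused kernels never enter the linear modules' own forwards.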

achew010 and others added 4 commits May 29, 2024 18:10
* workaround low-mem patch

* resolve conflicts and define patch function

* resolve conflicts and define patch function

* Apply suggestions from code review

Co-authored-by: Yu Chin Fabian Lim <fabianlim@users.noreply.github.com>

* revert hack to avoid low memory bug in HF memory metrics calculation

* reversed formatting

* reverse more formatting

---------

Co-authored-by: Yu Chin Fabian Lim <fabianlim@users.noreply.github.com>
…l-stack#27)

* group memory field names with prefix and minor fixes

* change to drop index on index reset
…oundation-model-stack#25)

* initial commit

Signed-off-by: Yu Chin Fabian Lim <flim@sg.ibm.com>

* add fast quantized plugin

Signed-off-by: Yu Chin Fabian Lim <flim@sg.ibm.com>

* add mistral and fix plugin

Signed-off-by: Yu Chin Fabian Lim <flim@sg.ibm.com>

* add licensing notices and instructions for adding new plugin.

Signed-off-by: Yu Chin Fabian Lim <flim@sg.ibm.com>

* handle linting, formatting

Signed-off-by: Yu Chin Fabian Lim <flim@sg.ibm.com>

* 2nd round of linting

Signed-off-by: Yu Chin Fabian Lim <flim@sg.ibm.com>

* activate workflow and some more lint fixes

Signed-off-by: Yu Chin Fabian Lim <flim@sg.ibm.com>

* add sample config

Signed-off-by: Yu Chin Fabian Lim <flim@sg.ibm.com>

* updates to benchmark, scenarios

Signed-off-by: Yu Chin Fabian Lim <flim@sg.ibm.com>

* fix tests

Signed-off-by: Yu Chin Fabian Lim <flim@sg.ibm.com>

---------

Signed-off-by: Yu Chin Fabian Lim <flim@sg.ibm.com>
Signed-off-by: Yu Chin Fabian Lim <flim@sg.ibm.com>