
[LoRA] Add LoRA support to AuraFlow #10216

Open · 31 commits to merge into base: main from auraflow-lora

Conversation

@hameerabbasi (Author)

What does this PR do?

This PR is a simple rebase of #9017

cc @sayakpaul for review.


@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@hameerabbasi (Author)

Thanks for the helping hand, @hlky!

@hlky (Collaborator)

hlky commented Dec 13, 2024

See https://github.com/huggingface/diffusers/blob/main/tests/lora/test_lora_layers_flux.py and https://github.com/huggingface/diffusers/blob/main/tests/lora/test_lora_layers_mochi.py as examples for tests. The test file seems to be missing `transformer_cls`, `get_dummy_inputs`, and probably others; ping @sayakpaul if you need help with the LoRA tests.
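For reference, the linked test files share a common shape: a test class declares the model classes under test plus a `get_dummy_inputs` that returns tiny, deterministic pipeline inputs. A minimal sketch of such a `get_dummy_inputs` is below; the keys and shapes are illustrative placeholders, not AuraFlow's real configuration:

```python
import torch

def get_dummy_inputs(device, seed=0):
    # Tiny, deterministic inputs so LoRA tests run in milliseconds.
    # All shapes and values below are illustrative, not AuraFlow's real config.
    generator = torch.Generator(device="cpu").manual_seed(seed)
    latents = torch.randn(1, 4, 8, 8, generator=generator)
    return {
        "prompt": "a photo of a cat",
        "num_inference_steps": 2,
        "guidance_scale": 5.0,
        "output_type": "np",
        "generator": generator,
        "latents": latents,
    }
```

The point of the fixed seed is that two runs (with and without a LoRA loaded) are comparable, which is what most of the assertions in these test suites rely on.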

@hameerabbasi (Author)

hameerabbasi commented Dec 15, 2024

@sayakpaul Okay; I'm at a point where I've got actual, valid test failures but have no idea where to look.
pytest.log

@hameerabbasi (Author)

Here's the log after the latest commit: pytest.log

@sayakpaul (Member) left a comment

Thanks for the PR. I have left some comments to fix a couple of things. LMK if they're unclear.

Six review threads on src/diffusers/loaders/lora_pipeline.py (resolved; two marked outdated).
@hameerabbasi (Author)

Latest test log.

@hameerabbasi (Author)

@sayakpaul I've uploaded the latest test log in this comment and noted the difficulties I ran into with certain changes. Some pointers would be appreciated.

@sayakpaul (Member) left a comment

Thanks for your work!

I left some more comments. LMK if they make sense.

@hameerabbasi (Author)

hameerabbasi commented Jan 7, 2025

@sayakpaul Here's the latest test log: pytest.log

Here's the list of failures:

FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_correct_lora_configs_with_different_ranks - AssertionError: False is not true
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_lora_B_bias - AssertionError: True is not false
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_low_cpu_mem_usage_with_injection - AssertionError: False is not true
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_low_cpu_mem_usage_with_loading - AssertionError: False is not true
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_set_adapters_match_attention_kwargs - AssertionError: False is not true
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference - AssertionError: False is not true
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_save_pretrained - AssertionError: False is not true
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_dora - AssertionError: False is not true
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_partial_text_lora - AssertionError: False is not true
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_text_denoiser_block_scale - AssertionError: True is not false : LoRA weights 1 and 2 should give different results
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_text_denoiser_lora_and_scale - AssertionError: False is not true
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_text_denoiser_lora_save_load - AssertionError: False is not true
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_text_denoiser_lora_unloaded - AssertionError: False is not true
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_text_denoiser_multi_adapter - AssertionError: True is not false : Adapter outputs should be different.
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_text_denoiser_multi_adapter_block_lora - AssertionError: True is not false : Adapter 1 and 2 should give different results
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_text_denoiser_multi_adapter_delete_adapter - AssertionError: True is not false : Adapter 1 and 2 should give different results
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_text_denoiser_multi_adapter_weighted - AssertionError: True is not false : Adapter 1 and 2 should give different results
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_text_lora - AssertionError: False is not true
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_text_lora_and_scale - AssertionError: False is not true
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_text_lora_denoiser_fused - AssertionError: False is not true
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_text_lora_denoiser_fused_multi - AssertionError: False is not true
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_text_lora_fused - AssertionError: False is not true
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_text_lora_save_load - AssertionError: False is not true
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_text_lora_unloaded - AssertionError: False is not true
===== 24 failed, 6 passed, 4 skipped, 396 deselected, 1 warning in 24.20s ======

Summary of the latest changes: I mostly copied from Flux, since it was the only model with a transformer and a single text encoder. Most of the failures seem to come from this line:

self.assertTrue(output_no_lora.shape == self.output_shape)
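That assertion compares the pipeline output against a hard-coded expected shape, so a failure usually means the dummy model config produces a different output size than the test class declares. A toy illustration of the failure mode, with made-up numbers:

```python
# Toy illustration only: the expected shape is hard-coded per test class,
# while the actual shape follows from the dummy model config.
output_shape = (1, 32, 32, 3)       # what the test class declares
sample_size, vae_scale = 8, 2       # hypothetical dummy-config values
actual = (1, sample_size * vae_scale, sample_size * vae_scale, 3)
print(actual == output_shape)       # False: 16 != 32, so assertTrue fails
```

In other words, the fix is typically to make `output_shape` and the dummy config agree, not to touch the assertion.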

@hameerabbasi (Author)

Current pytest.log

Failures:

=========================== short test summary info ============================
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_correct_lora_configs_with_different_ranks - AssertionError: False is not true
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_lora_B_bias - AssertionError: True is not false
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_low_cpu_mem_usage_with_loading - AttributeError: 'AuraFlowPipeline' object has no attribute 'lora_scale'
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_set_adapters_match_attention_kwargs - AssertionError: True is not false : Lora + scale should change the output
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_dora - AssertionError: True is not false : DoRA lora should change the output
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_partial_text_lora - ValueError: Target modules {'out_proj', 'v_proj', 'q_proj', 'k_proj'} not found in the base model. Please check the target modules and try again.
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_text_denoiser_block_scale - AssertionError: True is not false : LoRA weights 1 and 2 should give different results
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_text_denoiser_lora_and_scale - AssertionError: False is not true : Lora should change the output
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_text_denoiser_lora_save_load - AttributeError: 'AuraFlowPipeline' object has no attribute 'lora_scale'
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_text_denoiser_multi_adapter - AssertionError: True is not false : Adapter outputs should be different.
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_text_denoiser_multi_adapter_block_lora - AssertionError: True is not false : Adapter 1 and 2 should give different results
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_text_denoiser_multi_adapter_delete_adapter - AssertionError: True is not false : Adapter 1 and 2 should give different results
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_text_denoiser_multi_adapter_weighted - AssertionError: True is not false : Adapter 1 and 2 should give different results
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_text_lora - AssertionError: False is not true : Lora should change the output
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_text_lora_and_scale - AssertionError: False is not true : Lora should change the output
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_text_lora_denoiser_fused - AssertionError: True is not false : Fused lora should change the output
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_text_lora_fused - AssertionError: True is not false : Fused lora should change the output
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_text_lora_save_load - AttributeError: 'AuraFlowPipeline' object has no attribute 'lora_scale'
===== 18 failed, 12 passed, 4 skipped, 397 deselected, 1 warning in 23.58s =====

@hameerabbasi (Author)

It seems only CLIP is supported in

    def text_encoder_attn_modules(text_encoder):
        attn_modules = []
        if isinstance(text_encoder, (CLIPTextModel, CLIPTextModelWithProjection)):
            for i, layer in enumerate(text_encoder.text_model.encoder.layers):
                name = f"text_model.encoder.layers.{i}.self_attn"
                mod = layer.self_attn
                attn_modules.append((name, mod))
        else:
            raise ValueError(f"do not know how to get attention modules for: {text_encoder.__class__.__name__}")
        return attn_modules

    def text_encoder_mlp_modules(text_encoder):
        mlp_modules = []
        if isinstance(text_encoder, (CLIPTextModel, CLIPTextModelWithProjection)):
            for i, layer in enumerate(text_encoder.text_model.encoder.layers):
                mlp_mod = layer.mlp
                name = f"text_model.encoder.layers.{i}.mlp"
                mlp_modules.append((name, mlp_mod))
        else:
            raise ValueError(f"do not know how to get mlp modules for: {text_encoder.__class__.__name__}")
        return mlp_modules

which is called here:

    for name, _ in text_encoder_attn_modules(text_encoder):
        for module in ("out_proj", "q_proj", "k_proj", "v_proj"):
            rank_key = f"{name}.{module}.lora_B.weight"
            if rank_key not in text_encoder_lora_state_dict:
                continue
            rank[rank_key] = text_encoder_lora_state_dict[rank_key].shape[1]

    for name, _ in text_encoder_mlp_modules(text_encoder):
        for module in ("fc1", "fc2"):
            rank_key = f"{name}.{module}.lora_B.weight"
            if rank_key not in text_encoder_lora_state_dict:
                continue
            rank[rank_key] = text_encoder_lora_state_dict[rank_key].shape[1]

This essentially means we either need more plumbing to support arbitrary text encoders, or we restrict LoRA support to the transformer for AuraFlow, since AuraFlow has a single UMT5 encoder rather than a CLIP one.
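For the "more plumbing" route, here is a hedged sketch of what a UMT5 branch of that helper could look like. The attribute path used below (`encoder.block[i].layer[0].SelfAttention`, with `q`/`k`/`v`/`o` projections) follows the T5/UMT5 encoder layout in transformers, but should be verified against the actual model before relying on it:

```python
def text_encoder_attn_modules_umt5(text_encoder):
    # Sketch only: T5/UMT5 encoders nest self-attention as
    # encoder.block[i].layer[0].SelfAttention (projections q, k, v, o),
    # unlike CLIP's text_model.encoder.layers[i].self_attn.
    attn_modules = []
    for i, block in enumerate(text_encoder.encoder.block):
        name = f"encoder.block.{i}.layer.0.SelfAttention"
        attn_modules.append((name, block.layer[0].SelfAttention))
    return attn_modules
```

The target module names for the state-dict rank lookup would then be `q`, `k`, `v`, `o` rather than CLIP's `q_proj`/`k_proj`/`v_proj`/`out_proj`.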

@sayakpaul (Member)

sayakpaul commented Jan 7, 2025

Thanks! Let's support only the transformer for the moment, then? @hameerabbasi, let's also try to work on the failing tests.

@hameerabbasi (Author)

So I skipped all tests requiring the TE in e06d8eb. The latest failures are below; full log: pytest.log

=========================== short test summary info ============================
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_correct_lora_configs_with_different_ranks - AssertionError: False is not true
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_lora_B_bias - AssertionError: True is not false
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_set_adapters_match_attention_kwargs - AssertionError: True is not false : Lora + scale should change the output
FAILED tests/lora/test_lora_layers_auraflow.py::AuraFlowLoRATests::test_simple_inference_with_dora - AssertionError: True is not false : DoRA lora should change the output
==== 4 failed, 10 passed, 20 skipped, 398 deselected, 2 warnings in 15.99s =====

I'm not entirely sure how to get past these.
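For reference, skipping the TE-dependent tests presumably follows the standard `unittest` skip pattern; a self-contained sketch (class and test names here are illustrative, not the actual test suite):

```python
import unittest

class AuraFlowLoRASketch(unittest.TestCase):
    # Illustrative only: mirrors the pattern of skipping text-encoder
    # LoRA tests while keeping the transformer-only tests running.

    @unittest.skip("AuraFlow uses a UMT5 text encoder; TE LoRA is unsupported")
    def test_simple_inference_with_text_lora(self):
        pass

    def test_simple_inference(self):
        # Transformer-only path still runs.
        self.assertTrue(True)

suite = unittest.TestLoader().loadTestsFromTestCase(AuraFlowLoRASketch)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Skipped tests are reported separately from failures, which is why the summary above shows them in their own column.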

@sayakpaul (Member)

Could we try to look into each failure and debug?

@hameerabbasi hameerabbasi force-pushed the auraflow-lora branch 3 times, most recently from b59b25e to 00c921e Compare January 10, 2025 07:20