
[Docs] add a guide for adapter merging. #7110

Closed
sayakpaul wants to merge 5 commits from the advanced-adapter-inference branch

Conversation

@sayakpaul

What does this PR do?

Content reused from https://huggingface.co/blog/peft_merging. Adapter merging is very popular in the diffusion community. Even though it's experimental, I think having it inside our docs will be beneficial.


There are more examples of different merging methods available in [this post](https://huggingface.co/blog/peft_merging). We encourage you to give them a try 🤗

## AnimateDiff
@sayakpaul

@DN6 would you like to give this a try? I think that would be very cool!

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@stevhliu left a comment

Oh nice, I was wondering if it'd be possible to have add_weighted_adapter work with Diffusers! 🚀

The only concern for me is that merging/combining LoRA adapters feels too spread out in the docs now (it's covered in three different places). There are several options (I put a ⭐ for the one I'd recommend), so let me know what you think:

  1. add the add_weighted_adapter method in the Load adapters section
  2. add the add_weighted_adapter method in the Inference with PEFT tutorial
  3. create a new Merge LoRA guide in the Techniques section with set_adapter and add_weighted_adapter and then clean up the Load adapters doc to only show how to load ⭐
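
For context, at the PEFT level (independent of how it gets wired into a Diffusers pipeline), add_weighted_adapter merges several loaded LoRA adapters into a new named adapter. A minimal sketch, with the base model ID and adapter paths as placeholders rather than anything taken from this PR:

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

# Load a base model plus two LoRA adapters under distinct names
# (model ID and adapter paths are placeholders).
base = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-v0.1")
model = PeftModel.from_pretrained(base, "path/to/lora-a", adapter_name="a")
model.load_adapter("path/to/lora-b", adapter_name="b")

# Merge the two adapters into a new, standalone adapter and activate it.
model.add_weighted_adapter(
    adapters=["a", "b"],
    weights=[0.7, 0.3],
    adapter_name="a_plus_b",
    combination_type="linear",  # "ties" and "dare_linear" are other options covered in the blog post
)
model.set_adapter("a_plus_b")
```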

@sayakpaul commented Feb 28, 2024

add_weighted_adapter() is still quite experimental in nature as stated in the guide. You need to go through a number of steps to be able to get there. It's not as straightforward as using load_lora_weights() and then using set_adapters() (as shown in https://huggingface.co/docs/diffusers/main/en/tutorials/using_peft_for_inference).
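
For comparison, the straightforward flow mentioned above looks roughly like this. It's only a sketch; the SDXL checkpoint and LoRA repo names are illustrative examples rather than a verbatim copy of the tutorial:

```python
import torch
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")

# Load two LoRAs under distinct adapter names.
pipe.load_lora_weights("CiroN2022/toy-face", weight_name="toy_face_sdxl.safetensors", adapter_name="toy")
pipe.load_lora_weights("nerijs/pixel-art-xl", weight_name="pixel-art-xl.safetensors", adapter_name="pixel")

# Combine them at inference time with per-adapter scales; no explicit merging step needed.
pipe.set_adapters(["toy", "pixel"], adapter_weights=[0.8, 0.7])
image = pipe("a toy robot, pixel art style", num_inference_steps=30).images[0]
```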

So, I don't think we should add add_weighted_adapter() in the legacy docs.

Maybe we could give the guide a different title? "Advanced LoRA Merging"? Plus, we plan on showing the use of add_weighted_adapter() for AnimateDiff too.

Then we can link this guide in the other docs you mentioned.

@sayakpaul

@stevhliu gentle ping on the above.

@stevhliu

Sorry for the delay!

Since it's experimental, I especially don't think it belongs in the Tutorials because I don't think we want to provide an unstable (meaning things can still change) foundation for new users learning how to merge LoRAs. I also think it's not great to introduce two ways of merging LoRAs in the Tutorials because it places extra cognitive work on the learner to choose/distinguish which method to use. We should focus on recommending one general approach in the Tutorial, and at the end link to the other option (whether it's add_weighted_adapter or set_adapters).

Let me know what you think! 🙂

@sayakpaul

Let’s go with option 3 then. Would you be able to repurpose this PR for that?

@stevhliu

Let me start a new PR, it'll be easier for me to refactor everything :)

@sayakpaul

Sure, feel free to reuse the content from this one. Once the new PR is opened, I will close this one.

@stevhliu mentioned this pull request on Mar 4, 2024
@sayakpaul

Closing in light of #7213.

@sayakpaul closed this on Mar 5, 2024
@sayakpaul deleted the advanced-adapter-inference branch on March 5, 2024 at 03:27