Update for LoRA Adapters: Derived adapters and support for FLUX (#1602) (master) #1652
base: master
Conversation
…vinotoolkit#1602) This change introduces the concept of a "derived" adapter, which applies pipeline-dependent LoRA naming conventions to the original Safetensors adapters in a way that is hidden from the user. The user keeps identifying adapters with the original Adapter objects; internally, each one is wrapped by a "derived" adapter consisting of two parts: (1) a reference to the original Adapter, which serves as the unique adapter identifier, and (2) a postponed derivation action that is applied only once, when the transformed Adapter tensors are required for the first time. Different derivations are applied for SD/SDXL and FLUX. Three naming conventions work for FLUX: original diffusers, Kohya, and XLabs. Still missing: BFL.

Other changes:
- Ignore the original generation config in the LoRA LLMPipeline sample to align with the base greedy sample.
- Fix ProgressBar when it is used a second time, which happens in the text2image LoRA sample.
- Introduce a `SharedOptional` class to simplify conditional property modifications based on the adapters present, while keeping the same copy-on-modify behavior so that properties are not replicated when it is not needed.
- Split the LoRA adapter code into more files for better readability.
- Shorten the required prefix for the LLM's LoRA to be compatible with a wider set of adapters.

(cherry picked from commit 2d71315)
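For illustration, here is a minimal C++ sketch of the derived-adapter idea described above. The types and names (`Adapter`, `TensorMap`, `DerivedAdapter`, `derive`) are hypothetical stand-ins, not the actual implementation in this PR; the point is only that the original Adapter stays the user-visible identifier while the derivation is deferred and computed at most once.

```cpp
#include <functional>
#include <map>
#include <optional>
#include <string>
#include <utility>
#include <vector>

// Hypothetical stand-ins for the real types; names are illustrative only.
struct Adapter {
    std::string path;  // path to the original Safetensors file
};
using TensorMap = std::map<std::string, std::vector<float>>;  // tensor name -> data

// Sketch of a "derived" adapter: the original Adapter remains the user-visible
// identifier, and a postponed derivation (e.g. diffusers/Kohya/XLabs renaming
// for FLUX) is applied only once, when the tensors are first requested.
class DerivedAdapter {
public:
    using Derivation = std::function<TensorMap(const Adapter&)>;

    DerivedAdapter(Adapter origin, Derivation derive)
        : m_origin(std::move(origin)), m_derive(std::move(derive)) {}

    // Identity: still the original Adapter object the user passed in.
    const Adapter& origin() const { return m_origin; }

    // Lazily apply the pipeline-specific derivation, caching the result.
    const TensorMap& tensors() const {
        if (!m_derived) {
            m_derived = m_derive(m_origin);
        }
        return *m_derived;
    }

private:
    Adapter m_origin;
    Derivation m_derive;
    mutable std::optional<TensorMap> m_derived;  // filled on first use only
};
```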
Please update the section https://github.com/openvinotoolkit/openvino.genai/blob/master/SUPPORTED_MODELS.md#image-generation-models for FLUX + LoRA support.
@@ -87,6 +88,8 @@ struct OPENVINO_GENAI_EXPORTS AdapterConfig {
     float get_alpha(const Adapter& adapter) const;
     AdapterConfig& remove(const Adapter&);
     const std::vector<Adapter>& get_adapters() const { return adapters; }
+    std::vector<std::pair<Adapter, float>> get_adapters_and_alphas() const;
+    void set_adapters_and_alphas(const std::vector<std::pair<Adapter, float>>& adapters);
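As a hedged illustration of how the pair-based accessors added in this hunk might be used (the exact semantics are defined by the header above, not by this sketch; the `ov::genai` namespace and header path are assumptions):

```cpp
#include "openvino/genai/lora_adapter.hpp"  // assumed header location for AdapterConfig

// Illustrative only: rescale every adapter's alpha by a common factor
// using the new pair-based accessors.
ov::genai::AdapterConfig rescale_alphas(const ov::genai::AdapterConfig& config, float factor) {
    auto result = config;
    auto pairs = result.get_adapters_and_alphas();  // std::vector<std::pair<Adapter, float>>
    for (auto& [adapter, alpha] : pairs) {
        alpha *= factor;
    }
    result.set_adapters_and_alphas(pairs);
    return result;
}
```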
Please add Python API bindings.
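A minimal sketch of what such bindings could look like, assuming the project's Python module is built with pybind11 and that `AdapterConfig` is already exposed via `py::class_<AdapterConfig>`; the function name and any existing binding structure here are illustrative, not the actual code:

```cpp
// Hypothetical addition to the existing AdapterConfig bindings (pybind11 assumed).
#include <pybind11/pybind11.h>
#include <pybind11/stl.h>  // converts std::vector<std::pair<Adapter, float>> to a Python list of tuples

namespace py = pybind11;
using ov::genai::Adapter;
using ov::genai::AdapterConfig;

void bind_adapter_config_extras(py::class_<AdapterConfig>& cls) {
    cls.def("get_adapters_and_alphas", &AdapterConfig::get_adapters_and_alphas);
    cls.def("set_adapters_and_alphas", &AdapterConfig::set_adapters_and_alphas, py::arg("adapters"));
}
```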
Port of #1602 to master.