
v22.6.2 #1995

Merged — 91 commits, merged Feb 24, 2024

Changes from 1 commit (of 91 commits)
6849546
add gradual latent
kohya-ss Nov 23, 2023
610566f
Update README.md
kohya-ss Nov 23, 2023
2897a89
Merge branch 'dev' into gradual_latent_hires_fix
kohya-ss Nov 26, 2023
298c6c2
fix gradual latent cannot be disabled
kohya-ss Nov 26, 2023
2c50ea0
apply unsharp mask
kohya-ss Nov 27, 2023
29b6fa6
add unsharp mask
kohya-ss Nov 28, 2023
2952bca
fix strength error
kohya-ss Dec 1, 2023
7a4e507
add target_x flag (not sure this impl is correct)
kohya-ss Dec 3, 2023
e8c3a02
Merge branch 'dev' into gradual_latent_hires_fix
kohya-ss Dec 7, 2023
9278031
Merge branch 'dev' into gradual_latent_hires_fix
kohya-ss Dec 11, 2023
07ef03d
fix controlnet to work with gradual latent
kohya-ss Dec 11, 2023
d61ecb2
enable comment in prompt file, record raw prompt to metadata
kohya-ss Dec 11, 2023
da9b34f
Merge branch 'dev' into gradual_latent_hires_fix
kohya-ss Jan 4, 2024
2e4bee6
Log accelerator device
akx Jan 16, 2024
afc3870
Refactor memory cleaning into a single function
akx Jan 16, 2024
478156b
Refactor device determination to function; add MPS fallback
akx Jan 16, 2024
8f6f734
Merge branch 'dev' into gradual_latent_hires_fix
kohya-ss Jan 27, 2024
ccc3a48
Update IPEX Libs
Disty0 Jan 28, 2024
d4b9568
fix broken import in svd_merge_lora script
mgz-dev Jan 28, 2024
988dee0
IPEX torch.tensor FP64 workaround
Disty0 Jan 29, 2024
9d7729c
Merge pull request #1086 from Disty0/dev
kohya-ss Jan 31, 2024
7f948db
Merge pull request #1087 from mgz-dev/fix-imports-on-svd_merge_lora
kohya-ss Jan 31, 2024
2ca4d0c
Merge pull request #1054 from akx/mps
kohya-ss Jan 31, 2024
a6a2b5a
Fix IPEX support and add XPU device to device_utils
Disty0 Jan 31, 2024
9f0f0d5
Merge pull request #1092 from Disty0/dev_device_support
kohya-ss Feb 1, 2024
5cca1fd
add highvram option and do not clear cache in caching latents
kohya-ss Feb 1, 2024
1567ce1
Enable distributed sample image generation on multi-GPU enviroment (#…
DKnight54 Feb 3, 2024
11aced3
simplify multi-GPU sample generation
kohya-ss Feb 3, 2024
2f9a344
fix typo
kohya-ss Feb 3, 2024
6269682
unificaition of gen scripts for SD and SDXL, work in progress
kohya-ss Feb 3, 2024
bf2de56
fix formatting in resize_lora.py
mgz-dev Feb 4, 2024
1492bcb
add --new_conv_rank option
mgz-dev Feb 4, 2024
e793d77
reduce peak VRAM in sample gen
kohya-ss Feb 4, 2024
5f6bf29
Replace print with logger if they are logs (#905)
shirayu Feb 4, 2024
6279b33
fallback to basic logging if rich is not installed
kohya-ss Feb 4, 2024
efd3b58
Add logging arguments and update logging setup
kohya-ss Feb 4, 2024
74fe045
add comment for get_preferred_device
kohya-ss Feb 8, 2024
9b8ea12
update log initialization without rich
kohya-ss Feb 8, 2024
055f02e
add logging args for training scripts
kohya-ss Feb 8, 2024
5d9e287
make rich to output to stderr instead of stdout
kohya-ss Feb 8, 2024
7202596
log to print tag frequencies
kohya-ss Feb 10, 2024
f897d55
Merge pull request #1113 from kohya-ss/dev_multi_gpu_sample_gen
kohya-ss Feb 11, 2024
75ecb04
Merge branch 'dev' into dev_device_support
kohya-ss Feb 11, 2024
e24d960
add clean_memory_on_device and use it from training
kohya-ss Feb 12, 2024
e579648
fix help for highvram arg
kohya-ss Feb 12, 2024
672851e
Merge branch 'dev' into dev_improve_log
kohya-ss Feb 12, 2024
20ae603
Merge branch 'dev' into gradual_latent_hires_fix
kohya-ss Feb 12, 2024
35c6053
Merge pull request #1104 from kohya-ss/dev_improve_log
kohya-ss Feb 12, 2024
98f42d3
Merge branch 'dev' into gradual_latent_hires_fix
kohya-ss Feb 12, 2024
c748719
fix indent
kohya-ss Feb 12, 2024
358ca20
Merge branch 'dev' into dev_device_support
kohya-ss Feb 12, 2024
d3745db
add args for logging
kohya-ss Feb 12, 2024
cbe9c5d
supprt deep shink with regional lora, add prompter module
kohya-ss Feb 12, 2024
41d32c0
Merge pull request #1117 from kohya-ss/gradual_latent_hires_fix
kohya-ss Feb 12, 2024
93bed60
fix to work `--console_log_xxx` options
kohya-ss Feb 12, 2024
71ebcc5
update readme and gradual latent doc
kohya-ss Feb 12, 2024
baa0e97
Merge branch 'dev' into dev_device_support
kohya-ss Feb 17, 2024
42f3318
Merge pull request #1116 from kohya-ss/dev_device_support
kohya-ss Feb 17, 2024
75e4a95
update readme
kohya-ss Feb 17, 2024
d1fb480
format by black
kohya-ss Feb 18, 2024
6a9d9be
Fix lora_network_weights
bmaltais Feb 18, 2024
83bf2e0
chore(docker): rewrite Dockerfile
jim60105 Feb 17, 2024
dc94512
ci(docker): Add docker CI
jim60105 Feb 17, 2024
5dc9db6
Revert "ci(docker): Add docker CI"
jim60105 Feb 17, 2024
8330597
chore(docker): Add EXPOSE ports and change final base image to python…
jim60105 Feb 17, 2024
543d12f
chore(docker): fix dependencies for slim image
jim60105 Feb 17, 2024
d7add28
chore(docker): Add label
jim60105 Feb 18, 2024
a6f1ed2
fix dylora create_modules error
tamlog06 Feb 18, 2024
07116dc
Update options.md
mikeboensel Feb 18, 2024
a63c49c
Update options.md
mikeboensel Feb 18, 2024
5b19748
Update options.md
mikeboensel Feb 18, 2024
39e3a4b
Label clarifications
mikeboensel Feb 18, 2024
f71b3cf
Merge pull request #1978 from mikeboensel/patch-1
bmaltais Feb 18, 2024
7a49955
Merge pull request #1979 from mikeboensel/patch-2
bmaltais Feb 18, 2024
6a6c932
Merge pull request #1980 from mikeboensel/patch-3
bmaltais Feb 18, 2024
78e2df1
Merge pull request #1976 from jim60105/master
bmaltais Feb 18, 2024
2d0ed8e
Merge pull request #1981 from mikeboensel/patch-4
bmaltais Feb 18, 2024
86279c8
Merge branch 'dev' into DyLoRA-xl
kohya-ss Feb 24, 2024
488d187
Merge pull request #1126 from tamlog06/DyLoRA-xl
kohya-ss Feb 24, 2024
f413201
fix to work with cpu_count() == 1 closes #1134
kohya-ss Feb 24, 2024
24092e6
update einops to 0.7.0 #1122
kohya-ss Feb 24, 2024
fb9110b
format by black
kohya-ss Feb 24, 2024
0e70360
Merge branch 'dev' into resize_lora-add-rank-for-conv
kohya-ss Feb 24, 2024
738c397
Merge pull request #1102 from mgz-dev/resize_lora-add-rank-for-conv
kohya-ss Feb 24, 2024
52b3799
fix format, add new conv rank to metadata comment
kohya-ss Feb 24, 2024
8b7c142
some log output to print
kohya-ss Feb 24, 2024
81e8af6
fix ipex init
kohya-ss Feb 24, 2024
a21218b
update readme
kohya-ss Feb 24, 2024
e69d341
Merge pull request #1136 from kohya-ss/dev
kohya-ss Feb 24, 2024
a20c2bd
Merge branch 'main' of https://github.com/kohya-ss/sd-scripts into dev
bmaltais Feb 24, 2024
822d94c
Merge branch 'dev' of https://github.com/bmaltais/kohya_ss into dev
bmaltais Feb 24, 2024
fix dylora create_modules error
tamlog06 committed Feb 18, 2024
commit a6f1ed2e140eb4d4d37c0bb0502a7c0fd0621f5f
31 changes: 28 additions & 3 deletions networks/dylora.py

@@ -12,7 +12,9 @@
 import math
 import os
 import random
-from typing import List, Tuple, Union
+from typing import Dict, List, Optional, Tuple, Type, Union
+from diffusers import AutoencoderKL
+from transformers import CLIPTextModel
 import torch
 from torch import nn

@@ -165,7 +167,15 @@ def _load_from_state_dict(self, state_dict, prefix, local_metadata, strict, miss
         super()._load_from_state_dict(state_dict, prefix, local_metadata, strict, missing_keys, unexpected_keys, error_msgs)


-def create_network(multiplier, network_dim, network_alpha, vae, text_encoder, unet, **kwargs):
+def create_network(
+    multiplier: float,
+    network_dim: Optional[int],
+    network_alpha: Optional[float],
+    vae: AutoencoderKL,
+    text_encoder: Union[CLIPTextModel, List[CLIPTextModel]],
+    unet,
+    **kwargs,
+):
     if network_dim is None:
         network_dim = 4  # default
     if network_alpha is None:

@@ -182,6 +192,7 @@ def create_network(multiplier, network_dim, network_alpha, vae, text_encoder, un
         conv_alpha = 1.0
     else:
         conv_alpha = float(conv_alpha)
+
     if unit is not None:
         unit = int(unit)
     else:

@@ -306,8 +317,22 @@ def create_modules(is_unet, root_module: torch.nn.Module, target_replace_modules
                 lora = module_class(lora_name, child_module, self.multiplier, dim, alpha, unit)
                 loras.append(lora)
             return loras

+        text_encoders = text_encoder if type(text_encoder) == list else [text_encoder]
+
+        self.text_encoder_loras = []
+        for i, text_encoder in enumerate(text_encoders):
+            if len(text_encoders) > 1:
+                index = i + 1
+                print(f"create LoRA for Text Encoder {index}")
+            else:
+                index = None
+                print(f"create LoRA for Text Encoder")
+
+            text_encoder_loras = create_modules(False, text_encoder, DyLoRANetwork.TEXT_ENCODER_TARGET_REPLACE_MODULE)
+            self.text_encoder_loras.extend(text_encoder_loras)
-        self.text_encoder_loras = create_modules(False, text_encoder, DyLoRANetwork.TEXT_ENCODER_TARGET_REPLACE_MODULE)
+        # self.text_encoder_loras = create_modules(False, text_encoder, DyLoRANetwork.TEXT_ENCODER_TARGET_REPLACE_MODULE)
         print(f"create LoRA for Text Encoder: {len(self.text_encoder_loras)} modules.")

     # extend U-Net target modules if conv2d 3x3 is enabled, or load from weights
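The fix above normalizes `text_encoder` to a list so that a single code path creates LoRA modules for both SD (one text encoder) and SDXL (two text encoders). A minimal standalone sketch of that pattern follows; `create_text_encoder_loras` and `factory` are hypothetical names for illustration, not part of the repository:

```python
# Sketch of the normalization pattern from the DyLoRA fix: accept either a
# single text encoder or a list of them, then run one loop over the list.
# `factory` stands in for the real create_modules() call.

def create_text_encoder_loras(text_encoder, factory):
    """Return a flat list of LoRA modules for one or more text encoders."""
    # Wrap a single encoder in a list so the loop below handles both cases.
    text_encoders = text_encoder if isinstance(text_encoder, list) else [text_encoder]

    all_loras = []
    for i, encoder in enumerate(text_encoders):
        if len(text_encoders) > 1:
            print(f"create LoRA for Text Encoder {i + 1}")  # SDXL: number each encoder
        else:
            print("create LoRA for Text Encoder")  # SD: single encoder, no index
        all_loras.extend(factory(encoder))
    return all_loras


# Usage with a stand-in factory that returns one "module" per encoder:
fake_factory = lambda enc: [f"lora_for_{enc}"]
print(create_text_encoder_loras(["te1", "te2"], fake_factory))
# → ['lora_for_te1', 'lora_for_te2']
```

The original bug was that SDXL passes a list of two `CLIPTextModel` instances, while the old `create_modules` call assumed a single model; iterating over a normalized list resolves the `create_modules` error without branching the rest of the setup code.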