add TCD Scheduler #7174
Conversation
hi!
Yes, it works with both SD and SDXL. Currently we only release the SDXL LoRA. Here is a demo Space: https://huggingface.co/spaces/h1t/TCD
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
is this ready for a review?
Yup, at least it looks good to me now.
hey, thank you for the PR!
looks great and I think we can merge this soon.
I left a few nits.
Additionally, we introduced `begin_index` in a PR very recently (adcbe67#diff-98c53466d792a8faefb0045d29513a1c4bcd6b8144b8c06f66b697d4117039d7) - can you apply the same changes here? This will ensure your scheduler works properly with img2img pipelines.
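For reference, a minimal sketch of the `begin_index` plumbing, with method names following the other diffusers schedulers; the class and the example timesteps are illustrative only, not the actual diff:

```python
import torch


class BeginIndexSketch:
    """Illustrative stand-in for a scheduler that supports `begin_index`."""

    def __init__(self):
        self.timesteps = torch.tensor([999, 759, 499, 259])  # example inference schedule
        self._step_index = None
        self._begin_index = None

    @property
    def begin_index(self):
        # Index of the first timestep; img2img/inpaint pipelines set this before the denoising loop.
        return self._begin_index

    def set_begin_index(self, begin_index: int = 0):
        self._begin_index = begin_index

    def index_for_timestep(self, timestep, schedule_timesteps=None):
        if schedule_timesteps is None:
            schedule_timesteps = self.timesteps
        indices = (schedule_timesteps == timestep).nonzero()
        # If a timestep occurs more than once, take the second occurrence so a step
        # is not skipped when starting in the middle of the schedule.
        pos = 1 if len(indices) > 1 else 0
        return indices[pos].item()

    def _init_step_index(self, timestep):
        if self.begin_index is None:
            self._step_index = self.index_for_timestep(timestep)
        else:
            # Pipelines that skip the first timesteps start directly from begin_index.
            self._step_index = self._begin_index
```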
raise ValueError("Can only pass one of `num_inference_steps` or `custom_timesteps`.")

# 1. Calculate the TCD original training/distillation timestep schedule.
original_steps = (
It seems like we are handling the case when `original_steps = None`, but this code here will default `original_steps` to `config.original_inference_steps` when the user passes the `None` value - is this intended?
This is an option for setting the range of t: either a) ~ [0, 999] or b) ~ [19, 39, ..., 999] when `config.original_inference_steps=50`.
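For concreteness, a quick numeric check of the two options (assuming `num_train_timesteps=1000` and `strength=1.0`, as in the snippet quoted further down):

```python
import numpy as np

num_train_timesteps = 1000
strength = 1.0

# b) discrete distillation schedule when original_inference_steps=50
original_steps = 50
k = num_train_timesteps // original_steps  # 20
discrete = np.asarray(list(range(1, int(original_steps * strength) + 1))) * k - 1
print(discrete[:3], discrete[-1])  # [19 39 59] 999

# a) full training range when no discrete schedule is used
full = np.asarray(list(range(0, int(num_train_timesteps * strength))))
print(full[0], full[-1])  # 0 999
```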
Ok, got it.
So I think there might be an error in the code below then: if the user passes `original_inference_steps = None`, `original_steps` will be set to `config.original_inference_steps`, so `original_steps` won't be `None` and the `else` part won't get evaluated.
if original_steps is not None:
    if original_steps > self.config.num_train_timesteps:
        raise ValueError(
            f"`original_steps`: {original_steps} cannot be larger than `self.config.train_timesteps`:"
            f" {self.config.num_train_timesteps} as the unet model trained with this scheduler can only handle"
            f" maximal {self.config.num_train_timesteps} timesteps."
        )
    # TCD Timesteps Setting
    # The skipping step parameter k from the paper.
    k = self.config.num_train_timesteps // original_steps
    # TCD Training/Distillation Steps Schedule
    tcd_origin_timesteps = np.asarray(list(range(1, int(original_steps * strength) + 1))) * k - 1
else:
    tcd_origin_timesteps = np.asarray(list(range(0, int(self.config.num_train_timesteps * strength))))
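A tiny repro of the point above (the config default of 50 is an assumption taken from the discussion, not from the diff):

```python
# With this resolution, passing None falls back to the config value, so
# `original_steps` is only ever None when the config value itself is None.
config_original_inference_steps = 50  # assumed default


def resolve(original_inference_steps):
    return (
        original_inference_steps
        if original_inference_steps is not None
        else config_original_inference_steps
    )


print(resolve(None))  # 50 -> the `if original_steps is not None` branch runs; the else is unreachable
print(resolve(40))    # 40
```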
Left one question - let me know if it is not an error.
The PR looks ready to merge to me. Thanks for the great work!
    original_inference_steps if original_inference_steps is not None else self.config.original_inference_steps
)

if original_steps is not None:
Suggested change:
- if original_steps is not None:
+ if original_inference_steps is not None:
maybe this? not sure..
see https://github.com/huggingface/diffusers/pull/7174/files#r1511499565
The `else` should only be evaluated with customised training steps or in some special situation. I think the following would be better:

# default option
if original_inference_steps is None:
    tcd_origin_timesteps align with discrete inference steps
# customised
else:
    tcd_origin_timesteps truly randomised
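A concrete rendering of that pseudocode (my sketch, not the author's diff; the config default of 50 and `num_train_timesteps=1000` are taken from the discussion above):

```python
import numpy as np


def tcd_origin_schedule(original_inference_steps=None, config_default=50,
                        num_train_timesteps=1000, strength=1.0):
    if original_inference_steps is None:
        # default option: align with the discrete distillation schedule from the config
        original_steps = config_default
        k = num_train_timesteps // original_steps
        return np.asarray(list(range(1, int(original_steps * strength) + 1))) * k - 1
    # customised option: the origin timesteps may take any value in [0, num_train_timesteps)
    return np.asarray(list(range(0, int(num_train_timesteps * strength))))


print(tcd_origin_schedule()[:3])      # default: [19 39 59]
print(tcd_origin_schedule(40).shape)  # customised: (1000,), full range
```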
Nice!
I think there also needs to be a guide in our docs so users know how to best use TCD, just like we have https://huggingface.co/docs/diffusers/main/en/using-diffusers/inference_with_lcm_lora.
@yiyixuxu WDYT?
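As a starting point for such a guide, a minimal text-to-image sketch (the LoRA repo id and the parameter values are assumptions based on the demo Space, not final documentation):

```python
import torch
from diffusers import StableDiffusionXLPipeline, TCDScheduler  # TCDScheduler is added by this PR

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16, variant="fp16"
).to("cuda")

# Swap in the TCD scheduler and load the TCD LoRA weights.
pipe.scheduler = TCDScheduler.from_config(pipe.scheduler.config)
pipe.load_lora_weights("h1t/TCD-SDXL-LoRA")  # assumed repo id
pipe.fuse_lora()

image = pipe(
    "a photo of an astronaut riding a horse on the moon",
    num_inference_steps=4,
    guidance_scale=0.0,  # TCD, like LCM, is typically run without classifier-free guidance
    eta=0.3,             # corresponds to gamma in the paper; controls stochasticity
).images[0]
image.save("tcd_sdxl.png")
```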
sure!
let us know when it's ready to merge
what @sayakpaul suggested here is a great way to help promote this scheduler; feel free to add a guide either in this PR or a separate one.
I think the TCD scheduler is ready to merge now. I will open a new PR for the guide.
What does this PR do?
Add a new scheduler for trajectory consistency distillation: Paper and Official Implementation.
Before submitting
Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.