IPAdapterTesterMixin #6862
Conversation
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
thanks so much for working on this!!! 😇😇😇
Yes, that makes sense. Btw, am I right in assuming that IP-Adapter-Plus tests should also be added here?
hey thanks! looks really good!
let's wait a little bit for #6868 to merge and we can finish it up!
I think we should add it to all pipelines that support IP-Adapter, but we should only use the IP-Adapter image embeddings directly, so the code addition to the fast tests is actually minimal; I think you just have to append the mixin to the test, that's all :) Also, if we do it that way, we don't test the part of the code that's redundant.
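For illustration, a minimal sketch of what appending the mixin looks like in a pipeline's fast-test module (a sketch, assuming the mixins are importable from `tests/pipelines/test_pipelines_common.py` as in the diffusers test suite; the exact set of base classes varies per pipeline):

```python
import unittest

from diffusers import StableDiffusionPipeline

# In the diffusers repo these mixins live in tests/pipelines/test_pipelines_common.py.
from ..test_pipelines_common import IPAdapterTesterMixin, PipelineTesterMixin


class StableDiffusionPipelineFastTests(IPAdapterTesterMixin, PipelineTesterMixin, unittest.TestCase):
    # Appending IPAdapterTesterMixin is the whole change: its test_ip_adapter
    # method reuses pipeline_class and the dummy inputs the test class already
    # defines, and feeds precomputed image embeddings directly.
    pipeline_class = StableDiffusionPipeline
```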
Co-authored-by: YiYi Xu <yixu310@gmail.com>
love this! thank you!
look, it already caught a bunch of test failures!!! 😅 Not all of the pipelines need to have IP-Adapter support IMO. So if a pipeline isn't used very much (e.g. Stable Diffusion Safe) and the test failure is because IP-Adapter isn't correctly implemented there, let me know! We can probably remove it.
Thanks for the Good PR label 😆 Just adding the IPAdapterTesterMixin to the class definition was quite convenient! Also quite cool to see the error logs 👀 Will fix in a bit. I agree with removing the tests from the not-so-used pipelines that are not regularly discussed in issues. I will remove them from the ones I think are lesser used, but since you have access to make changes to my branch, please feel free to change it to your liking.
Pipelines that have an error that looks something like:

```
FAILED tests/pipelines/controlnet/test_controlnet_img2img.py::ControlNetImg2ImgPipelineFastTests::test_ip_adapter - TypeError: argument of type 'NoneType' is not iterable
FAILED tests/pipelines/stable_diffusion/test_stable_diffusion_img2img.py::StableDiffusionImg2ImgPipelineFastTests::test_ip_adapter - TypeError: argument of type 'NoneType' is not iterable
```

fail because the IP-Adapter implementation there does:

```python
added_cond_kwargs = {"image_embeds": image_embeds} if ip_adapter_image is not None else None
```

The check should include both `ip_adapter_image` and `ip_adapter_image_embeds`, as done in some pipelines of #6868, but it looks like a few spots were missed. Looks like this mixin will help make things quite consistent!
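A minimal sketch of the corrected condition (variable names assumed from the pipeline's `__call__`; not necessarily the exact merged code):

```python
# Build added_cond_kwargs whenever either the raw adapter image or the
# precomputed embeddings are passed, instead of checking only the image.
added_cond_kwargs = (
    {"image_embeds": image_embeds}
    if (ip_adapter_image is not None or ip_adapter_image_embeds is not None)
    else None
)
```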
thank you!
@yiyixuxu I'm not sure why these tests fail. I think the results differ slightly based on device, because they do pass on my system with a difference greater than 0.01. Is lowering the threshold the ideal solution, or what do you have in mind? Maybe a threshold on the accumulated differences could also be used. Regarding ControlNetSimpleInpaintPipelineFastTests, I'm not sure why the outputs are the same.
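For context, a self-contained sketch of the two kinds of threshold checks under discussion (the arrays are made-up stand-ins; the real test compares actual pipeline outputs):

```python
import numpy as np

# Stand-ins for pipeline outputs: a baseline run, a run with the IP-Adapter
# scale set to 0 (should be a no-op), and a run with scale 1 (should differ).
baseline = np.random.rand(1, 64, 64, 3).astype(np.float32)
with_scale_0 = baseline + 1e-6
with_scale_1 = baseline + 0.05

expected_max_diff = 1e-4
# At scale 0 the output must stay within the tolerance of the baseline...
assert np.abs(with_scale_0 - baseline).max() < expected_max_diff
# ...and at scale 1 it must move by more than the tolerance, proving the
# adapter actually affected the result. Device-dependent numerics shift these
# max-diff values, which is why the threshold choice matters.
assert np.abs(with_scale_1 - baseline).max() > expected_max_diff
```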
```python
# forward pass with a single IP-Adapter, but with scaled adapter weights
inputs = self._modify_inputs_for_ip_adapter_test(self.get_dummy_inputs(torch_device))
inputs["ip_adapter_image_embeds"] = [self._get_dummy_image_embeds(cross_attention_dim)]
pipe.set_ip_adapter_scale(1.0)
```
Maybe increase this value to 100 lol
@a-r-r-o-w
Thanks @yiyixuxu. I believe all tests should pass now and that this is ready for merge.
Hi @a-r-r-o-w
inputs["output_type"] = "np" | ||
inputs["return_dict"] = False | ||
if "image" in inputs.keys(): | ||
inputs["num_inference_steps"] = 4 |
Just a question: why increase this based on whether `image` is provided? Because of `strength`?
That is correct. The tests fail for img2img tests that have num_inference_steps set to 2 or lower and strength greater than 0.5. I think if it was:

```python
if "image" in inputs.keys() and "strength" in inputs.keys():
    ...
```

it would make more sense.
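For context, a worked sketch of why a low step count interacts badly with `strength` (this mirrors the timestep truncation diffusers img2img pipelines apply in `get_timesteps`, though the exact code varies per pipeline):

```python
# strength truncates the schedule: only the last int(steps * strength)
# timesteps are actually run.
num_inference_steps, strength = 2, 0.5
init_timestep = min(int(num_inference_steps * strength), num_inference_steps)  # 1
t_start = max(num_inference_steps - init_timestep, 0)                          # 1
effective_steps = num_inference_steps - t_start
print(effective_steps)  # 1 -- a single denoising step leaves outputs too
                        # close together to compare reliably, hence 4 steps
```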
inputs["num_inference_steps"] = 4 | ||
return inputs | ||
|
||
def test_ip_adapter(self, expected_max_diff: float = 1e-4): |
Can we break this up into individually named tests? It is hard to debug when an intermediate assert causes the entire test to fail.
I've split it into two tests for the time being but note that if we have K classes that derive from this mixin, we now have K additional inferences running as compared to before. The time to execute all tests doesn't increase too much though.
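A rough sketch of the split (the method names here are illustrative, not necessarily the ones that were merged; `_test_ip_adapter` is a hypothetical shared helper):

```python
class IPAdapterTesterMixin:
    def _test_ip_adapter(self, num_adapters: int, expected_max_diff: float):
        ...  # shared setup, forward passes, and threshold asserts

    # Each scenario reports its own pass/fail, so an early assert failure no
    # longer hides the later scenarios -- at the cost of one extra inference
    # per test class that derives from the mixin.
    def test_ip_adapter_single(self, expected_max_diff: float = 1e-4):
        self._test_ip_adapter(num_adapters=1, expected_max_diff=expected_max_diff)

    def test_ip_adapter_multi(self, expected_max_diff: float = 1e-4):
        self._test_ip_adapter(num_adapters=2, expected_max_diff=expected_max_diff)
```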
Yeah, I think I've been having trouble getting it to work. I'm not sure why the results vary :/
Force-pushed from d618540 to 6744cac.
@yiyixuxu Apologies for the delay. With the last commit, all tests pass for me locally and I hope this is completed 🥲
thank you :)
What does this PR do?
Adds IPAdapterTesterMixin to ensure pipelines that support IP-Adapter are not broken by newer changes.
Fixes #6829.
Who can review?
@yiyixuxu @sayakpaul