
Improve the precision of our integration tests #914

Closed
patrickvonplaten opened this issue Oct 19, 2022 · 7 comments · Fixed by #1052
Labels: stale (Issues that haven't received updates)

Comments

@patrickvonplaten
Contributor

We currently have rather low precision when testing our pipelines, for two reasons:

    • Our reference is an image and not a numpy array. This means that when we created our reference image we lost float precision, which was unnecessary.
    • We only test for .max() < 1e-2. IMO we should test for .max() < 1e-4 against the numpy arrays. In my experiments across multiple devices I have not seen differences bigger than 1e-4 when using full precision (a sketch of such a check follows below).

IMO this could also have prevented #902.
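
A minimal sketch of what the tighter check could look like, assuming the reference is stored as a full-precision .npy file (the helper name and file path are illustrative, not from this issue):

```python
import numpy as np

def assert_matches_reference(image: np.ndarray, reference_path: str, tol: float = 1e-4) -> None:
    """Compare a pipeline output (a float32 array) against a reference stored
    as a .npy file, so no precision is lost to 8-bit image encoding."""
    expected = np.load(reference_path)
    max_diff = np.abs(image - expected).max()
    assert max_diff < tol, f"max absolute difference {max_diff} exceeds tolerance {tol}"
```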

@Lewington-pitsos
Contributor

I'm going to start work on this right now

@Lewington-pitsos
Contributor

@patrickvonplaten after some research I think I understand the issue.

The images are currently stored in a low-precision format (e.g. PNG), which prevents us from testing at any precision finer than 1e-2. Even if we convert the image into a numpy array, this will not help, since the image itself has already lost precision.

What we need to do is store a numpy representation of the image, say somewhere under https://huggingface.co/datasets, which we can then download and use for comparison.

The way we could do this is by generating the test output, saving that output as a numpy array (.npy file) and uploading it.

This is what I plan to do tomorrow.

This stack overflow thread was very helpful in understanding the issue.
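
A rough sketch of that plan, assuming the pipeline output is a float32 numpy array; the file name and the commented usage are illustrative only:

```python
import numpy as np

def save_reference(image: np.ndarray, path: str) -> None:
    """Save a generated image as a float32 .npy file so the reference keeps
    full float precision instead of being quantised to 8-bit image channels."""
    np.save(path, image.astype(np.float32))

# Illustrative usage, assuming a diffusers pipeline run with numpy output:
#   image = pipe(prompt, output_type="np").images[0]
#   save_reference(image, "expected_output.npy")
```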

@patrickvonplaten
Contributor Author

Hey @Lewington-pitsos,

That's a very good observation - we came to the same conclusion here: #937 :-)
Do you have an account on the Hugging Face Hub? Would you like to upload the numpy images to a dataset on the Hub maybe? :-) This would be super useful!
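
In case it helps, a hedged sketch of what such an upload could look like with huggingface_hub; the repo id and file paths below are placeholders, not the actual dataset:

```python
from huggingface_hub import HfApi

api = HfApi()
# Placeholder repo id and paths; the real dataset is linked later in this thread.
api.upload_file(
    path_or_fileobj="expected_output.npy",
    path_in_repo="pipeline_name/expected_output.npy",
    repo_id="username/diffusers-reference-outputs",
    repo_type="dataset",
)
```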

patrickvonplaten self-assigned this Oct 26, 2022
@Lewington-pitsos
Contributor

Hey, yes I am currently working on this in fact!

@Lewington-pitsos
Contributor

@patrickvonplaten I made a PR adding the files: https://huggingface.co/datasets/hf-internal-testing/diffusers-images/discussions/2
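
For illustration, a test could then pull a reference from that dataset roughly like this (the file name in the usage comment is hypothetical):

```python
import numpy as np
from huggingface_hub import hf_hub_download

def load_reference(filename: str) -> np.ndarray:
    """Download a .npy reference from the shared dataset repo and load it."""
    local_path = hf_hub_download(
        repo_id="hf-internal-testing/diffusers-images",
        filename=filename,
        repo_type="dataset",
    )
    return np.load(local_path)

# e.g. expected = load_reference("some_pipeline/expected_output.npy")  # hypothetical path
```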

@patrickvonplaten
Contributor Author

github-actions bot added the stale (Issues that haven't received updates) label Nov 26, 2022
huggingface deleted a comment from github-actions bot Nov 30, 2022
patrickvonplaten removed the stale (Issues that haven't received updates) label Nov 30, 2022
@github-actions

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

Please note that issues that do not follow the contributing guidelines are likely to be ignored.

github-actions bot added the stale (Issues that haven't received updates) label Dec 24, 2022
github-actions bot closed this as completed Jan 2, 2023
PhaneeshB pushed a commit to nod-ai/diffusers that referenced this issue Mar 1, 2023
[SD] Modify the flags to use --iree-preprocessing-pass-pipeline (huggingface#914)

* [SD] Modify the flags to use --iree-preprocessing-pass-pipeline

* Fix flags in sd_annotation
PhaneeshB pushed a commit to nod-ai/diffusers that referenced this issue Mar 1, 2023
* Revert "[SD] Modify the flags to use --iree-preprocessing-pass-pipeline (huggingface#914)"

This reverts commit a783c08.

* Revert "Fix iree flags due to the change in shark-runtime (huggingface#944)"

This reverts commit 1d38d49.