
ci: fix xpu skip condition for test_model_parallel_beam_search #35742

Merged
merged 1 commit into huggingface:main on Jan 17, 2025

Conversation

Contributor

@dvrogozh commented on Jan 17, 2025

`return unittest.skip()`, used in the XPU skip condition of `test_model_parallel_beam_search`, did not actually mark the test as skipped when running under pytest:

  • 148 passed, 1 skipped

Other tests use `self.skipTest()`. Reusing that approach, and moving the condition outside the loop (since it does not depend on the loop variable), skips the test correctly on XPU (a minimal sketch of the difference follows the result below):

  • 148 skipped
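
A minimal sketch of the behavior, using an illustrative test class and a placeholder condition (not the actual transformers test code):

```python
import unittest


class ExampleTest(unittest.TestCase):
    def test_return_unittest_skip(self):
        # unittest.skip("reason") only builds and returns a decorator object.
        # Returning it from inside a running test raises nothing, so pytest
        # records the test as passed, not skipped.
        if True:  # placeholder for the real XPU condition
            return unittest.skip("device_map='auto' not supported")

    def test_self_skiptest(self):
        # self.skipTest("reason") raises unittest.SkipTest, so the test is
        # reported as skipped by both unittest and pytest.
        if True:  # placeholder for the real XPU condition
            self.skipTest("device_map='auto' not supported")
```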

Secondly, `device_map="auto"` is now implemented for XPU with IPEX >= 2.5 and torch >= 2.6, so these tests can now be enabled for XPU on new IPEX/torch versions.
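
For context, a minimal sketch of the path these tests exercise, assuming accelerate is installed, IPEX >= 2.5 with torch >= 2.6 on XPU, and an illustrative checkpoint rather than the models actually covered by the test suite:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "gpt2"  # illustrative checkpoint, not the tested models
tokenizer = AutoTokenizer.from_pretrained(model_id)

# device_map="auto" lets accelerate place/shard the weights across the
# available devices (e.g. xpu:0, xpu:1) instead of pinning to torch_device.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

inputs = tokenizer("Hello world", return_tensors="pt").to(model.device)
# Beam search, as exercised by test_model_parallel_beam_search.
outputs = model.generate(**inputs, num_beams=2, max_new_tokens=10)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```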

Fixes: 1ea3ad1 ("[tests] use torch_device instead of auto for model testing (#29531)")
CC: @faaany @ydshieh

Collaborator

@ydshieh left a comment


Thank you 🤗 !

Verified

This commit was created on GitHub.com and signed with GitHub’s verified signature.
Signed-off-by: Dmitry Rogozhkin <dmitry.v.rogozhkin@intel.com>
@ydshieh merged commit 7d4b3dd into huggingface:main on Jan 17, 2025
23 checks passed
bursteratom pushed a commit to bursteratom/transformers that referenced this pull request Jan 31, 2025
ci: fix xpu skip condition for test_model_parallel_beam_search (huggingface#35742)

elvircrn pushed a commit to elvircrn/transformers that referenced this pull request Feb 13, 2025
ci: fix xpu skip condition for test_model_parallel_beam_search (huggingface#35742)

3 participants