
Update training_args.py - addition of self.distributed_state when using XPU #25999

Merged
merged 2 commits on Sep 13, 2023

Conversation

@Serizao (Contributor) commented Sep 5, 2023

addition of self.distributed_state when using XPU

Fix

In the base code, self.distributed_state does not appear to be defined, which causes the script to crash when it is used (on lines 1813 and 1814 in my case).

I therefore propose defining this variable when an XPU is used.
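A minimal, self-contained sketch of the shape of the proposed change. The stub `PartialState` and the `ACCELERATE_USE_XPU` variable name are illustrative stand-ins for the real `accelerate` machinery, not the actual diff in `training_args.py`:

```python
import os

class PartialState:
    """Illustrative stub for accelerate.PartialState: it reads the
    environment once, at construction time."""
    def __init__(self):
        self.use_xpu = os.environ.get("ACCELERATE_USE_XPU", "false") == "true"

def setup_devices(use_xpu: bool):
    """Sketch of the relevant branch of device setup (hypothetical)."""
    distributed_state = None
    if use_xpu:
        # Export the flag *before* constructing PartialState so it is
        # picked up, and define distributed_state so later code that
        # reads it (lines 1813-1814 in the report) no longer crashes
        # on an undefined attribute.
        os.environ["ACCELERATE_USE_XPU"] = "true"
        distributed_state = PartialState()
    return distributed_state

state = setup_devices(use_xpu=True)
```

The key point is that `distributed_state` is always assigned on the XPU path before any later code dereferences it.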

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

Missing distributed state, so lines 1813-1814 failed because the value is undefined
@amyeroberts (Collaborator)

cc @muellerzr @pacman100

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint.

@muellerzr (Contributor) left a comment

Nice catch! We should also change the os.environ to be in front so PartialState picks it up
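A toy illustration of why this ordering matters. The stub below mirrors how a state object like `accelerate`'s `PartialState` snapshots the environment when it is constructed; the class and variable name are assumptions for the demonstration:

```python
import os

class PartialState:
    # Stub: captures the flag once, at construction time.
    def __init__(self):
        self.use_xpu = os.environ.get("ACCELERATE_USE_XPU", "false") == "true"

# Wrong order: the state object is built before the flag is exported,
# so it never sees the XPU setting.
os.environ.pop("ACCELERATE_USE_XPU", None)
too_early = PartialState()

# Right order: export first, construct second.
os.environ["ACCELERATE_USE_XPU"] = "true"
in_time = PartialState()

print(too_early.use_xpu, in_time.use_xpu)  # False True
```

This is why the review asks for the `os.environ` assignment to be moved in front of the `PartialState` construction.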

@muellerzr (Contributor) left a comment

Thanks! LG2M here

@amyeroberts (Collaborator) left a comment

Thanks for adding!

Before merging - could you:

  • Update the PR title to something more descriptive
  • Give a bit more context about the error?

The addition makes sense and is in line with the other logic. However, self.distributed_state is defined on L1782, and the lines referenced in the PR description (L1813-1814) don't seem to be related to this issue. This is mainly for documentation if people come back to this PR.

@amyeroberts amyeroberts changed the title Update training_args.py Update training_args.py - addition of self.distributed_state when using XPU Sep 13, 2023
@amyeroberts amyeroberts merged commit e52f1cb into huggingface:main Sep 13, 2023
@Serizao Serizao deleted the patch-1 branch September 14, 2023 20:05
parambharat pushed a commit to parambharat/transformers that referenced this pull request Sep 26, 2023
Update training_args.py - addition of self.distributed_state when using XPU (huggingface#25999)

* Update training_args.py

Missing distributed state, so lines 1813-1814 failed because the value is undefined

* Update training_args.py

Co-authored-by: Zach Mueller <muellerzr@gmail.com>

---------

Co-authored-by: Zach Mueller <muellerzr@gmail.com>
5 participants