
fix max_epochs setup in basic example #1105

Merged

Conversation

@oplatek (Contributor) commented Mar 9, 2020

tested only for the CPU version

Before submitting

  • Was this discussed/approved via a GitHub issue? (no need for typo and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure to update the docs?
  • Did you write any new necessary tests?
  • If you made a notable change (that affects users), did you update the CHANGELOG?

What does this PR do?

Fixes the issue that the max_epochs parameter is ignored.

Tested only by running it locally:

python cpu_template.py --epochs 1
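The gist of the fix can be sketched as follows: the parsed `--epochs` value must actually be forwarded to the Trainer's `max_epochs` argument rather than silently dropped. This is a minimal, self-contained sketch (the `build_trainer_kwargs` helper and argument wiring are illustrative, not the exact code of `cpu_template.py`):

```python
import argparse

def build_trainer_kwargs(argv):
    """Parse CLI args and map them to Trainer keyword arguments.

    Hypothetical helper illustrating the fix: before, the example
    constructed the Trainer with defaults and ignored --epochs;
    after, the user-supplied value is passed through as max_epochs.
    """
    parser = argparse.ArgumentParser()
    parser.add_argument('--epochs', default=10, type=int)
    hparams = parser.parse_args(argv)
    # The fix: forward the parsed value, e.g. Trainer(max_epochs=hparams.epochs)
    return {'max_epochs': hparams.epochs}

# Mirrors the manual test above: `python cpu_template.py --epochs 1`
print(build_trainer_kwargs(['--epochs', '1']))  # → {'max_epochs': 1}
```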

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

🌞

@oplatek oplatek changed the title fix max_epochs setup in basic example WIP: fix max_epochs setup in basic example Mar 9, 2020
@oplatek (Contributor, Author) commented Mar 9, 2020

Also, consider removing the argparse argument

parser.add_argument('--optimizer_name', default='adam', type=str)

because it is not used anywhere in the code.

@oplatek oplatek changed the title WIP: fix max_epochs setup in basic example fix max_epochs setup in basic example Mar 9, 2020
@Borda Borda added bug Something isn't working and removed bug Something isn't working labels Mar 9, 2020
@Borda Borda self-requested a review March 9, 2020 23:55
@Borda (Member) commented Mar 11, 2020

Hey there, we have added a GPU CI test, so could we kindly ask you to rebase/merge master, which will trigger these tests so we do not need to test manually... Thanks for your understanding 🤖

@oplatek oplatek force-pushed the fix-basic-example-max-epoch-setup branch from 722ed40 to 3228a64 Compare March 12, 2020 08:07
@oplatek (Contributor, Author) commented Mar 12, 2020

@Borda I just force-pushed the branch after rebasing onto upstream/master (upstream == the PyTorch Lightning master).

@Borda (Member) left a review comment:

LGTM 🚀

@Borda Borda added ready PRs ready to be merged bug Something isn't working labels Mar 12, 2020
@Borda Borda added this to the 0.7.2 milestone Mar 12, 2020
@williamFalcon williamFalcon merged commit 1383f64 into Lightning-AI:master Mar 12, 2020
tullie pushed a commit to tullie/pytorch-lightning that referenced this pull request Apr 3, 2020
@Borda Borda modified the milestones: v0.7., v0.7.x Apr 18, 2021
Labels: bug (Something isn't working), ready (PRs ready to be merged)

3 participants