feat: Set minuit strategy automatically for gradient/non-gradient mode #1183
Conversation
for more information, see https://pre-commit.ci
Codecov Report
@@            Coverage Diff            @@
##           master    #1183     +/-   ##
=========================================
  Coverage   97.43%   97.43%
=========================================
  Files          63       63
  Lines        3665     3669      +4
  Branches      522      522
=========================================
+ Hits         3571     3575      +4
  Misses         55       55
  Partials       39       39
Co-authored-by: Matthew Feickert <matthew.feickert@cern.ch>
LGTM. Thanks @kratsg.
for more information, see https://pre-commit.ci
Woops.
ImportError while loading conftest '/home/runner/work/pyhf/pyhf/tests/conftest.py'.
tests/conftest.py:87: in <module>
pyhf.optimize.minuit_optimizer(),
src/pyhf/optimize/opt_minuit.py:35: in __init__
self.strategy = kwargs.pop('strategy', None)
E AttributeError: 'minuit_optimizer' object has no attribute 'strategy'
I'll check again in a bit.
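The traceback above shows `self.strategy = kwargs.pop('strategy', None)` raising an `AttributeError` from inside `__init__` itself. One way this exact failure mode arises in plain Python (a hypothetical reproduction, not necessarily how pyhf's class was actually laid out) is when the class uses `__slots__` and the assigned name is not listed in the slots:

```python
# Hypothetical illustration of the failure mode seen in the traceback
# (not necessarily pyhf's actual class layout): with __slots__, assigning
# a name that is not listed in the slots raises AttributeError, even
# inside __init__.
class minuit_optimizer:
    __slots__ = ('maxiter',)  # 'strategy' is missing from the slots

    def __init__(self, **kwargs):
        # Raises AttributeError: object has no attribute 'strategy'
        self.strategy = kwargs.pop('strategy', None)

try:
    minuit_optimizer(strategy=2)
except AttributeError as err:
    print(err)
```

The fix in such a case is simply to add the attribute name to the slots (or to the parent class's expected options).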
The CI is still running, but I'll approve this so that it can be merged in once everything in CI hits green.
feat: Set minuit strategy automatically for gradient/non-gradient mode (scikit-hep#1183)
* Set MINUIT strategy to 0/1 automatically if using gradients/no-gradient
* Allow for MINUIT strategy to be configurable
* Add test for strategy selection
* Properly guard the `strategy` option for the Minuit optimizer in the case that a `strategy` of 0 is used. Without checking `self.strategy is not None`, the `if self.strategy` check for `self.strategy == 0` evaluates the truthiness of `False`, giving the wrong value of `not do_grad` for a `do_grad` of `False`. Amends PR #1183.
* Use the corrected behavior to improve testing of the default in test_minuit_strategy_global.
* Add Daniel Werner to contributors list.
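The truthiness pitfall this follow-up commit fixes can be reproduced in isolation. A minimal sketch with illustrative function names (not pyhf's actual code):

```python
def pick_strategy_buggy(strategy, do_grad):
    # Bug: `if strategy` is False when strategy == 0, so an explicit
    # strategy=0 falls through to the gradient-based default.
    if strategy:
        return strategy
    return int(not do_grad)


def pick_strategy_fixed(strategy, do_grad):
    # Correct guard: only fall back to the default when no strategy
    # was given at all.
    if strategy is not None:
        return strategy
    return int(not do_grad)


# With do_grad=False the default is 1, so an explicit strategy=0 is lost:
print(pick_strategy_buggy(0, do_grad=False))  # 1  (wrong: user asked for 0)
print(pick_strategy_fixed(0, do_grad=False))  # 0  (respects the user's choice)
```

This is why the guard had to be `is not None` rather than a bare truthiness check: 0 is a legitimate MINUIT strategy value.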
Pull Request Description
Teach pyhf to set the minuit strategy correctly depending on whether or not the user provides a gradient. Additionally, allow for this to be configurable via the `strategy` kwarg. Resolves #1172.
ReadTheDocs build: https://pyhf.readthedocs.io/en/feat-userprovidedgradientstominuit/_generated/pyhf.optimize.opt_minuit.minuit_optimizer.html
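The automatic selection described here amounts to a one-line rule; a hedged sketch (the helper name is illustrative, not pyhf's API):

```python
def default_minuit_strategy(do_grad):
    # Strategy 0 skips MINUIT's extra gradient checks, which is safe
    # when an analytic gradient is supplied; strategy 1 (MINUIT's own
    # default) is kept when gradients are computed numerically.
    return 0 if do_grad else 1


print(default_minuit_strategy(True))   # 0
print(default_minuit_strategy(False))  # 1
```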
Checklist Before Requesting Reviewer
Before Merging
For the PR Assignees: