
feat: Set minuit strategy automatically for gradient/non-gradient mode #1183

Merged: 13 commits merged into master on Nov 19, 2020

Conversation

kratsg (Contributor)

@kratsg kratsg commented Nov 18, 2020

Pull Request Description

Teach pyhf to set the Minuit strategy automatically depending on whether or not the user provides a gradient. Additionally, allow this to be configured via the `strategy` kwarg.

Resolves #1172.

ReadTheDocs build: https://pyhf.readthedocs.io/en/feat-userprovidedgradientstominuit/_generated/pyhf.optimize.opt_minuit.minuit_optimizer.html
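The selection rule described above can be sketched as follows (a hypothetical helper for illustration, not pyhf's actual internals; the names `do_grad` and `strategy` follow the PR discussion):

```python
def select_strategy(do_grad, strategy=None):
    """Pick a MINUIT strategy: 0 when a user-provided gradient is
    available (do_grad=True), 1 otherwise, unless the user passes an
    explicit `strategy` override."""
    if strategy is not None:
        return strategy
    return 0 if do_grad else 1


print(select_strategy(do_grad=True))               # 0: gradient provided
print(select_strategy(do_grad=False))              # 1: MINUIT estimates gradients itself
print(select_strategy(do_grad=False, strategy=2))  # 2: explicit override wins
```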

Checklist Before Requesting Reviewer

  • Tests are passing
  • "WIP" removed from the title of the pull request
  • Selected an Assignee for the PR to be responsible for the log summary

Before Merging

For the PR Assignees:

  • Summarize commit messages into a comprehensive review of the PR:
    * Set MINUIT strategy to 0/1 automatically if using gradients/no-gradient
    * Allow for MINUIT strategy to be configurable
    * Add test for strategy selection

@kratsg kratsg added API Changes the public API feat/enhancement New feature or request optimization labels Nov 18, 2020
@kratsg kratsg self-assigned this Nov 18, 2020
@kratsg kratsg changed the title feat: Set minuit strategy automatically for gradient/non-gradient mode; configurable feat: Set minuit strategy automatically for gradient/non-gradient mode Nov 18, 2020
@matthewfeickert matthewfeickert added the tests pytest label Nov 18, 2020

codecov bot commented Nov 18, 2020

Codecov Report

Merging #1183 (fef57e0) into master (937d716) will increase coverage by 0.00%.
The diff coverage is 100.00%.

Impacted file tree graph

@@           Coverage Diff           @@
##           master    #1183   +/-   ##
=======================================
  Coverage   97.43%   97.43%           
=======================================
  Files          63       63           
  Lines        3665     3669    +4     
  Branches      522      522           
=======================================
+ Hits         3571     3575    +4     
  Misses         55       55           
  Partials       39       39           
Flag        Coverage Δ
unittests   97.43% <100.00%> (+<0.01%) ⬆️

Flags with carried forward coverage won't be shown.

Impacted Files                    Coverage Δ
src/pyhf/optimize/opt_minuit.py   100.00% <100.00%> (ø)

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 937d716...fef57e0.

@matthewfeickert (Member) left a comment
LGTM. Thanks @kratsg.

@matthewfeickert (Member) left a comment

Woops.

ImportError while loading conftest '/home/runner/work/pyhf/pyhf/tests/conftest.py'.
tests/conftest.py:87: in <module>
    pyhf.optimize.minuit_optimizer(),
src/pyhf/optimize/opt_minuit.py:35: in __init__
    self.strategy = kwargs.pop('strategy', None)
E   AttributeError: 'minuit_optimizer' object has no attribute 'strategy'

I'll check again in a bit.
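For context, an `AttributeError` raised on assignment like the one in the traceback above is characteristic of a class that declares `__slots__` without listing the attribute being assigned. A minimal reproduction with hypothetical stand-in classes (not pyhf's actual source):

```python
class OptimizerMixin:
    __slots__ = ['maxiter', 'verbose']

    def __init__(self, **kwargs):
        self.maxiter = kwargs.pop('maxiter', 100000)
        self.verbose = kwargs.pop('verbose', 0)


class broken_minuit_optimizer(OptimizerMixin):
    # Bug: 'strategy' is not declared in __slots__, so the assignment
    # in __init__ raises AttributeError.
    __slots__ = ['name', 'tolerance']

    def __init__(self, **kwargs):
        self.name = 'minuit'
        self.tolerance = kwargs.pop('tolerance', 0.1)
        self.strategy = kwargs.pop('strategy', None)  # AttributeError here
        super().__init__(**kwargs)


class fixed_minuit_optimizer(OptimizerMixin):
    # Fix: list 'strategy' in __slots__ so the assignment is allowed.
    __slots__ = ['name', 'tolerance', 'strategy']

    def __init__(self, **kwargs):
        self.name = 'minuit'
        self.tolerance = kwargs.pop('tolerance', 0.1)
        self.strategy = kwargs.pop('strategy', None)
        super().__init__(**kwargs)


try:
    broken_minuit_optimizer()
except AttributeError as err:
    print(f'AttributeError: {err}')

print(fixed_minuit_optimizer(strategy=2).strategy)  # 2
```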

@matthewfeickert (Member) left a comment

The CI is still running, but I'll approve this so that it can be merged in once everything in CI hits green.

@kratsg kratsg merged commit 985aafa into master Nov 19, 2020
@kratsg kratsg deleted the feat/userprovidedGradientsToMinuit branch November 19, 2020 01:27
Aspyona pushed a commit to Aspyona/pyhf that referenced this pull request Nov 22, 2020
feat: Set minuit strategy automatically for gradient/non-gradient mode (scikit-hep#1183)

* Set MINUIT strategy to 0/1 automatically if using gradients/no-gradient
* Allow for MINUIT strategy to be configurable
* Add test for strategy selection
matthewfeickert pushed a commit that referenced this pull request Aug 15, 2023
* Properly guard the `strategy` option for the Minuit optimizer in the case that
  a `strategy` of 0 is used.
  Without checking `self.strategy is not None`, the guard `if self.strategy`
  evaluates as falsy for `self.strategy == 0`, so the code falls back to
  `not do_grad`, which gives the wrong value when `do_grad` is False.
   - Amends PR #1183
* Use the corrected behavior to improve testing of the default in test_minuit_strategy_global.
* Add Daniel Werner to the contributors list.
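The truthiness bug described in this commit can be illustrated with a small standalone sketch (hypothetical functions that mirror the described logic, not the actual pyhf source):

```python
def choose_strategy_buggy(strategy, do_grad):
    # Bug: `if strategy` is falsy for strategy == 0, so an explicit
    # strategy=0 is ignored and the do_grad-based default is used instead.
    if strategy:
        return int(strategy)
    return int(not do_grad)


def choose_strategy_fixed(strategy, do_grad):
    # Fix: compare against None so that an explicit 0 is respected.
    if strategy is not None:
        return int(strategy)
    return int(not do_grad)


# User explicitly requests strategy 0 without providing a gradient:
print(choose_strategy_buggy(0, do_grad=False))  # 1: wrong, the override is dropped
print(choose_strategy_fixed(0, do_grad=False))  # 0: the requested value
```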
Labels
API Changes the public API feat/enhancement New feature or request optimization tests pytest
Development

Successfully merging this pull request may close these issues.

use strategy 0 for user-provided gradients in minuit
2 participants