Add support for MOI.ScalarNonlinearFunction #346
Conversation
Codecov Report

@@             Coverage Diff             @@
##           master     #346      +/-   ##
==========================================
+ Coverage   93.58%   94.02%   +0.43%
==========================================
  Files           4        4
  Lines         858      921      +63
==========================================
+ Hits          803      866      +63
  Misses         55       55

View full report in Codecov by Sentry.
src/MOI_wrapper.jl (Outdated)
@@ -5,6 +5,16 @@

 include("utils.jl")

 const _PARAMETER_OFFSET = 0x00f0000000000000
This is a messy piece of logic to copy in each NLP solver. It's fine for now, but can this eventually go into MOI.Nonlinear.Model, or some other abstraction layer?
Yes... it's unfortunately messy. I haven't thought about ways to abstract this because I think we'd need to see a couple of implementations first.
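The PR itself only shows the constant, not how it is used. As a hedged sketch of the kind of logic being discussed (the helper names below are hypothetical, inferred from the constant's name, not taken from the PR): the high-bit offset lets a solver wrapper store parameters and decision variables in a single `Int64` index space, because parameter indices are shifted into a range that real variable indices never reach.

```julia
# Hedged sketch, NOT the PR's code: hypothetical helpers showing how a
# high-bit offset can keep parameter indices disjoint from variable indices.
const _PARAMETER_OFFSET = 0x00f0000000000000

# Variables are numbered 1, 2, 3, ...; parameters are mapped above the offset.
_to_parameter_index(i::Int64) = i + Int64(_PARAMETER_OFFSET)
_is_parameter(i::Int64) = i >= Int64(_PARAMETER_OFFSET)
_from_parameter_index(i::Int64) = i - Int64(_PARAMETER_OFFSET)
```

Every NLP solver wrapper that supports parameters would need to repeat some variant of this bookkeeping, which is why the reviewer suggests eventually hoisting it into MOI.Nonlinear.Model.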
if model.nlp_model !== nothing
    backend = MOI.Nonlinear.SparseReverseMode()
    evaluator = MOI.Nonlinear.Evaluator(model.nlp_model, backend, vars)
    model.nlp_data = MOI.NLPBlockData(evaluator)
Can we now delete the code to set nlp_data as an attribute?
Ah. So we didn't discuss the deprecation plan in today's call. My suggestion is that we leave all of the current infrastructure in place: if you use @NLconstraint, it goes via NLPBlock, and if you use @constraint, it goes via ScalarNonlinearFunction.
On the documentation front in JuMP, we'd just delete the mentions of the @NL macros, but perhaps leave them in the reference docstrings with a compat annotation/warning.
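To make the two routing paths concrete, here is a hedged sketch (assuming a JuMP version where @constraint accepts nonlinear expressions, which is what this PR is working toward; not code from the PR):

```julia
using JuMP

model = Model()
@variable(model, x)

# Legacy path: lowered through the NLPBlock infrastructure.
@NLconstraint(model, x^2 + sin(x) <= 1)

# New path: lowered to a MOI.ScalarNonlinearFunction-in-set constraint,
# which a solver supporting this PR can consume directly.
@constraint(model, x^2 + sin(x) <= 1)
```

The point of leaving both in place is that existing user code keeps working unchanged while solvers gain support for the new function type.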
Force-pushed from 779c3ba to 2e54f8f
x-ref: jump-dev/MathOptInterface.jl#2059
The most basic example is working:
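(The snippet that followed is truncated in this copy. A hypothetical reconstruction of what a most-basic MOI-level example looks like, assuming an `Optimizer` implementing this PR; the variable and set choices here are illustrative, not from the original:)

```julia
using MathOptInterface
const MOI = MathOptInterface

# `Optimizer` is a placeholder for this package's solver wrapper.
model = Optimizer()
x = MOI.add_variable(model)

# Build x^2 as a ScalarNonlinearFunction and constrain x^2 <= 1.
f = MOI.ScalarNonlinearFunction(:^, Any[x, 2])
MOI.add_constraint(model, f, MOI.LessThan(1.0))
```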