NLP parser assumes ::Model and violates MethodError principle #2402

Closed
odow opened this issue Dec 12, 2020 · 1 comment · Fixed by #2406

odow (Member) commented Dec 12, 2020

_process_NL_expr gets called from the @NL macros and doesn't have any type requirements:

function _process_NL_expr(model, ex)

but it eventually calls _parse_NL_expr_runtime, which is restricted to ::Model:
function _parse_NL_expr_runtime(m::Model, x::Real, tape, parent, values)

Thus, someone (e.g., BilevelJuMP, x-ref: joaquimg/BilevelJuMP.jl#57) using this machinery with an <:AbstractModel encounters a MethodError:
https://discourse.julialang.org/t/methoderror-no-method-matching-parse-nl-expr-runtime-bilevel-problem/51681
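
As a rough illustration of how the failure surfaces, here is a hypothetical sketch (ToyModel is a made-up AbstractModel subtype, not BilevelJuMP, and the placeholder arguments stand in for whatever the @NL macro expansion actually passes):

using JuMP

struct ToyModel <: JuMP.AbstractModel end  # hypothetical extension model type

# None of the internal parser methods accept a ToyModel, so this throws, e.g.
# MethodError: no method matching _parse_NL_expr_runtime(::ToyModel, ::Float64, ...)
JuMP._parse_NL_expr_runtime(ToyModel(), 1.0, Any[], -1, Float64[])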

We could either add a type restriction to _process_NL_expr and provide a fallback with an informative error message, or we could relax the type restrictions and see what happens. The latter probably means relaxing this to AbstractVariableRef as well:

JuMP.jl/src/parse_nlp.jl, lines 187 to 188 at 1e6b5d8:

function _parse_NL_expr_runtime(m::Model, x::VariableRef, tape, parent, values)
    if owner_model(x) !== m

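A minimal sketch of the fallback option, assuming it lives in src/parse_nlp.jl so the internal names resolve (illustrative only, not a tested fix):

function _parse_NL_expr_runtime(model::AbstractModel, x, tape, parent, values)
    # Illustrative fallback: any AbstractModel other than a plain Model lands
    # here and gets an actionable explanation instead of a bare MethodError.
    return error(
        "Nonlinear expressions are not supported for models of type " *
        "$(typeof(model)): the @NL macros currently require a JuMP.Model.",
    )
end

This keeps current behavior for Model while turning the extension-facing failure into a readable error; relaxing the restrictions instead would mean widening this signature and the VariableRef method above to AbstractModel/AbstractVariableRef.
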
cc @pulsipher who might be messing with this stuff

pulsipher (Contributor) commented Dec 12, 2020

This has indeed come up with InfiniteOpt.jl as I've looked into how to enable NLP expressions (see infiniteopt/InfiniteOpt.jl#16). At one point, I did make a version of _parse_NL_expr_runtime that could work with InfiniteModels and my variable types; however, I ran into a number of other problems further down the NLP pipeline.

Thus, I came to the conclusion that the JuMP NLP interface in its current form is essentially not extensible to AbstractModels and AbstractVariableRefs without a significant rewrite such as the one proposed in jump-dev/MathOptInterface.jl#846. Since that is a long way out, I am actually looking into developing a native NLP expression tree abstraction in InfiniteOpt.jl to enable NLP expressions/constraints.

For what it is worth, my recommendation here would be to add a meaningful fallback method and reserve any additional effort for jump-dev/MathOptInterface.jl#846 once the time comes to work on it.
