Improve NLP documentation about ForwardDiff (#2000)
odow authored Jun 29, 2019
1 parent 8e7b117 commit ea60e4f
Showing 1 changed file with 15 additions and 3 deletions.
`docs/src/nlp.md` — 18 changes: 15 additions & 3 deletions
````diff
@@ -193,9 +193,21 @@
 JuMP uses [ForwardDiff.jl](https://github.com/JuliaDiff/ForwardDiff.jl) to
 perform automatic differentiation; see the ForwardDiff.jl
 [documentation](http://www.juliadiff.org/ForwardDiff.jl/v0.10.2/user/limitations.html)
 for a description of how to write a function suitable for automatic
-differentiation. The general guideline is to write code that is generic with
-respect to the number type; don't assume that the input to the function is
-`Float64`. To register a user-defined function with derivatives computed by
+differentiation.
+
+!!! note
+    If you see method errors with `ForwardDiff.Duals`, see the guidelines at
+    [ForwardDiff.jl](http://www.juliadiff.org/ForwardDiff.jl/release-0.10/user/limitations.html).
+    The most common error is that your user-defined function is not generic with
+    respect to the number type, i.e., don't assume that the input to the function
+    is `Float64`.
+    ```julia
+    f(x::Float64) = 2 * x # This will not work.
+    f(x::Real) = 2 * x # This is good.
+    f(x) = 2 * x # This is also good.
+    ```
+
+To register a user-defined function with derivatives computed by
 automatic differentiation, use the `register` method as in the following
 example:
````
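The guideline this patch documents can be sketched in plain Julia, with no JuMP or ForwardDiff required (`f_strict` and `f_generic` are hypothetical names used only for illustration): a method restricted to `Float64` raises a `MethodError` when called with any other number type, such as the dual numbers automatic differentiation passes in, while a generic method accepts them all.

```julia
# Sketch of the "be generic in the number type" guideline from the diff above.
# `f_strict` and `f_generic` are hypothetical names for illustration.

f_strict(x::Float64) = 2 * x  # only accepts Float64
f_generic(x) = 2 * x          # accepts any number type

f_generic(3.0)       # Float64 works for both functions...
f_generic(3)         # ...but only the generic method also accepts an Int,
f_generic(big"1.5")  # a BigFloat, or a ForwardDiff dual number.

# f_strict(3) would throw a MethodError (no method matching f_strict(::Int64)),
# which is the kind of failure the note in the patch describes.
```

This is why the patched documentation recommends `f(x::Real)` or `f(x)` over `f(x::Float64)` for functions registered with JuMP.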
