# NLPSaUT

The `NLPSaUT` module constructs a JuMP model for a generic nonlinear program (NLP). The expected use case is solving a differentiable (either analytically or numerically) nonconvex NLP with gradient-based algorithms such as Ipopt or SNOPT.

The user is expected to provide a "fitness function" (pygmo-style) that evaluates the objective, equality constraints, and inequality constraints. Derivatives of `f_fitness` are taken using `ForwardDiff.jl` (the default JuMP behavior according to its docs); as such, `f_fitness` should be written in a way that is compatible with `ForwardDiff.jl` (see the `ForwardDiff.jl` docs for why it is `ForwardDiff`, not `ReverseDiff`). For reference, see the JuMP docs page on common mistakes when using `ForwardDiff.jl`.
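As a concrete illustration of `ForwardDiff.jl` compatibility, the sketch below (a hypothetical two-variable function, not part of `NLPSaUT`) shows the most common pitfall: hard-coding `Float64` storage, which breaks when `ForwardDiff` calls the function with its dual-number type.

```julia
using ForwardDiff

# BAD: preallocating Float64 storage fails under ForwardDiff,
# because dual numbers cannot be stored in a Float64 array.
function f_bad(x)
    out = zeros(Float64, 1)
    out[1] = x[1]^2 + sin(x[2])
    return out[1]
end

# GOOD: stay generic in the element type and let it propagate
# from the input, so dual numbers flow through unchanged.
f_good(x) = x[1]^2 + sin(x[2])

ForwardDiff.gradient(f_good, [1.0, 0.0])   # returns [2.0, 1.0]
# ForwardDiff.gradient(f_bad, [1.0, 0.0]) # errors: cannot convert Dual to Float64
```

The same principle applies to any intermediate buffers inside `f_fitness`: allocate them with the input's element type (e.g. `zeros(eltype(x), n)`) rather than `Float64`.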
The `model` constructed by `NLPSaUT` utilizes memoization to economize on the fitness evaluation (see the JuMP Tips and tricks on NLP).
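The memoization pattern referenced above can be sketched as follows. This is a minimal, self-contained illustration in the spirit of the JuMP tips-and-tricks page, not `NLPSaUT`'s actual implementation: the fitness function and its dimensions are hypothetical, and the wrapper simply caches the last input so that the objective and each constraint can be exposed as scalar functions without re-running the full vector-valued evaluation.

```julia
# Hypothetical pygmo-style fitness function: returns a single vector
# [objective; equality constraints; inequality constraints].
function f_fitness(x::T...) where {T<:Real}
    f = x[1]^2 + x[2]^2        # objective
    h = x[1] + x[2] - 1.0      # equality constraint (h = 0)
    g = x[1] - 0.8             # inequality constraint (g <= 0)
    return [f, h, g]
end

# Minimal memoization wrapper: cache the last input tuple and its
# fitness vector, so each scalar output reuses one shared evaluation.
function memoize_fitness(f_fitness::Function, n_outputs::Int)
    last_x, last_fx = nothing, nothing
    function f_cached(i::Int, x::T...) where {T<:Real}
        if x !== last_x
            last_x, last_fx = x, f_fitness(x...)
        end
        return last_fx[i]::T
    end
    return [(x...) -> f_cached(i, x...) for i in 1:n_outputs]
end

fs = memoize_fitness(f_fitness, 3)
fs[1](0.5, 0.5)   # objective at (0.5, 0.5); fs[2], fs[3] reuse the cached vector
```

Each element of `fs` can then be registered with JuMP as a scalar user-defined operator; evaluating all three at the same point costs only one call to `f_fitness`.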
## Quick start

1. `git clone` this repository
2. Start the Julia REPL
3. Activate & instantiate the package (first time only):

```julia-repl
pkg> activate .
julia> using Pkg           # first time only
julia> Pkg.instantiate()   # first time only
```
## Tests

```julia-repl
(NLPSaUT) pkg> test
```