Core routines

Construct model

Main.NLPSaUT.build_model! (Method)
build_model!(model::JuMP.Model, f_fitness::Function, nx::Int, nh::Int, ng::Int, lx::Vector, ux::Vector, x0::Vector=nothing)

Extend model for NLP problem via memoized fitness function.

Arguments:

  • model::JuMP.Model: model to append objective and constraints
  • f_fitness::Function: fitness function, returning [f, h, g]
  • nx::Int: number of decision variables
  • nh::Int: number of equality constraints
  • ng::Int: number of inequality constraints
  • lx::Vector: lower bounds on decision variables
  • ux::Vector: upper bounds on decision variables
  • x0::Vector: initial guess
  • auto_diff::Bool: whether to use automatic differentiation
  • order::Int: order of FiniteDifferences, minimum is 2
  • fd_type::String: finite-difference method, "forward", "backward", or "central"
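
The snippet below is an illustrative sketch, not taken from the package docs: it assumes the fitness function takes splatted scalar arguments (as in JuMP's memoization tutorial) and returns the stacked vector [f, h, g]. The model-assembly calls are shown as comments since they require NLPSaUT and an optimizer such as Ipopt to be loaded.

```julia
# Hypothetical fitness function in the [f, h, g] layout expected by build_model!:
# minimize f = x1^2 + x2^2
# subject to h = x1 + x2 - 1 = 0 (equality) and g = -x1 <= 0 (inequality).
function f_fitness(x...)
    f = x[1]^2 + x[2]^2      # objective
    h = x[1] + x[2] - 1.0    # equality constraint (h = 0)
    g = -x[1]                # inequality constraint (g <= 0)
    return [f, h, g]
end

# With JuMP, Ipopt, and NLPSaUT loaded, the model would then be assembled as:
#   model = Model(Ipopt.Optimizer)
#   build_model!(model, f_fitness, 2, 1, 1, [-10.0, -10.0], [10.0, 10.0], [0.5, 0.5])
#   optimize!(model)
```
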
Main.NLPSaUT.build_model (Method)
build_model(f_fitness::Function, nx::Int, nh::Int, ng::Int, lx::Vector, ux::Vector, x0::Vector=nothing, fd_type::Function=nothing, order::Int=2)

Build model for NLP problem with memoized fitness function.

Arguments:

  • optimizer: optimizer to use with the model
  • f_fitness::Function: fitness function, returning [f, h, g]
  • nx::Int: number of decision variables
  • nh::Int: number of equality constraints
  • ng::Int: number of inequality constraints
  • lx::Vector: lower bounds on decision variables
  • ux::Vector: upper bounds on decision variables
  • x0::Vector: initial guess
  • auto_diff::Bool: whether to use automatic differentiation
  • order::Int: order of FiniteDifferences, minimum is 2
  • fd_type::String: finite-difference method, "forward", "backward", or "central"
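
In contrast to build_model!, this variant constructs and returns the model itself. The sketch below is hypothetical (a one-variable problem invented for illustration); the assembly call is commented out because it requires NLPSaUT and an optimizer, and the exact call should be checked against the package source.

```julia
# Hypothetical one-variable problem: minimize f = (x - 2)^2 with no equality
# constraints (nh = 0) and one inequality g = -x <= 0 (ng = 1).
fitness(x...) = [(x[1] - 2.0)^2, -x[1]]   # stacked [f, g]

# With NLPSaUT and e.g. Ipopt loaded, the model would be built and solved as:
#   model = build_model(Ipopt.Optimizer, fitness, 1, 0, 1, [-5.0], [5.0], [0.0])
#   optimize!(model)
```
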

Memoization

Main.NLPSaUT.memoize_fitness (Method)
memoize_fitness(f_fitness::Function, n_outputs::Int)

Memoize the fitness function. Because each memoized output foo_i is auto-differentiated with ForwardDiff, the cache needs to work both when x is a Float64 and when it is a ForwardDiff.Dual.

See: https://jump.dev/JuMP.jl/stable/tutorials/nonlinear/tips_and_tricks/#Memoization

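
A minimal, package-free sketch of the memoization trick referenced above: evaluate the multi-output fitness function once per new input, cache the result, and hand each scalar output back through its own closure (the names memoize_sketch and n_evals are invented for this illustration; the real implementation additionally keeps a separate cache for ForwardDiff.Dual inputs).

```julia
function memoize_sketch(f_fitness::Function, n_outputs::Int)
    last_x, last_f = nothing, nothing
    n_evals = Ref(0)   # counts underlying fitness evaluations, for demonstration
    function f_i(i::Int, x::T...) where {T<:Real}
        # Tuples of numbers compare by value with ===, so identical inputs
        # are served from the cache instead of re-evaluating f_fitness.
        if x !== last_x
            last_x = x
            last_f = f_fitness(x...)
            n_evals[] += 1
        end
        return last_f[i]
    end
    return [(x...) -> f_i(i, x...) for i in 1:n_outputs], n_evals
end
```

Requesting every output at the same point triggers only one fitness evaluation, which is the point of the trick: JuMP registers each scalar output as a separate function, but the expensive computation runs once.
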
Main.NLPSaUT.memoize_fitness_gradient (Function)
memoize_fitness_gradient(f_fitness::Function, nfitness::Int, fd_type::Function, order::Int=2)

Create a memoized gradient computation with the finite-difference method specified by fd_type:

  • fd_type = "forward": use FiniteDifferences.forward_fdm()
  • fd_type = "central": use FiniteDifferences.central_fdm()
  • fd_type = "backward": use FiniteDifferences.backward_fdm()

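
For intuition, the sketch below implements the 2nd-order central difference by hand (the name central_diff_gradient is invented for this illustration); the package delegates this to FiniteDifferences.central_fdm() rather than rolling its own stencil.

```julia
# Component-wise 2nd-order central difference: g[i] ≈ (f(x + h*e_i) - f(x - h*e_i)) / (2h).
function central_diff_gradient(f::Function, x::Vector{Float64}; h::Float64=1e-6)
    g = similar(x)
    for i in eachindex(x)
        e = zeros(length(x))
        e[i] = h
        g[i] = (f(x .+ e) - f(x .- e)) / (2h)
    end
    return g
end
```

The forward and backward variants shift the stencil to one side of x, which matters when the fitness function is only defined on one side of a bound.
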