KNITRO.jl Documentation
Release 0.0
Ng Yee Sian
November 27, 2014
Contents

1 Contents
    1.1 Installation Guide
    1.2 Example
    1.3 Creating and Solving Problems
    1.4 Changing and reading solver parameters
    1.5 Callbacks
    1.6 JuMP interface
The KNITRO.jl package provides an interface for using the KNITRO solver from the Julia language. You cannot use
KNITRO.jl without having purchased and installed a copy of KNITRO from Ziena Optimization. This package is
available free of charge and in no way replaces or alters any functionality of Ziena’s KNITRO solver.
KNITRO functionality is extensive, so coverage is incomplete, but the basic functionality for solving linear, nonlinear,
and mixed-integer programs is provided.
CHAPTER 1
Contents
1.1 Installation Guide
1. First, you must obtain a copy of the KNITRO software and a license; trial versions and academic licenses are
available here.
2. Once KNITRO is installed on your machine, point the LD_LIBRARY_PATH (Linux) or DYLD_LIBRARY_PATH (OS X) variable to the KNITRO library by adding, e.g.,
export LD_LIBRARY_PATH="$HOME/knitro-9.0.1-z/lib:$LD_LIBRARY_PATH"
or
export DYLD_LIBRARY_PATH="$HOME/knitro-9.0.1-z/lib:$DYLD_LIBRARY_PATH"
to your start-up file (e.g. .bash_profile).
3. At the Julia prompt, run
julia> Pkg.add("KNITRO")
4. Test that KNITRO works by running
julia> Pkg.test("KNITRO")
1.1.1 Setting up KNITRO on Windows
Note that currently only 64-bit Windows is supported. That is, you must use 64-bit Julia and install the Win64 version
of KNITRO.
1. First, you must obtain a copy of the KNITRO software and a license; trial versions and academic licenses are
available here.
2. Once KNITRO is installed on your machine, add the directory containing knitro.dll to the PATH environment variable, as described in the KNITRO documentation.
3. At the Julia prompt, run
julia> Pkg.add("KNITRO")
4. Test that KNITRO works by running
julia> Pkg.test("KNITRO")
1.2 Example
We begin with an example to motivate the various interfaces. Here is what the problem looks like in Julia with the
KNITRO.jl interface:
using KNITRO
using Base.Test
#  min   9 - 8x1 - 6x2 - 4x3
#        + 2(x1^2) + 2(x2^2) + (x3^2) + 2(x1*x2) + 2(x1*x3)
#  subject to  c[0]:  x1 + x2 + 2x3 <= 3
#              x1 >= 0
#              x2 >= 0
#              x3 >= 0
#  initpt (0.5, 0.5, 0.5)
#
#  Solution is x1=4/3, x2=7/9, x3=4/9, lambda=2/9  (f* = 1/9)
#
#  The problem comes from Hock and Schittkowski, HS35.
function eval_f(x::Vector{Float64})
    linear_terms = 9.0 - 8.0*x[1] - 6.0*x[2] - 4.0*x[3]
    quad_terms = 2.0*x[1]^2 + 2.0*x[2]^2 + x[3]^2 + 2.0*x[1]*x[2] + 2.0*x[1]*x[3]
    return linear_terms + quad_terms
end

function eval_g(x::Vector{Float64}, cons::Vector{Float64})
    cons[1] = x[1] + x[2] + 2.0*x[3]
end

function eval_grad_f(x::Vector{Float64}, grad::Vector{Float64})
    grad[1] = -8.0 + 4.0*x[1] + 2.0*x[2] + 2.0*x[3]
    grad[2] = -6.0 + 2.0*x[1] + 4.0*x[2]
    grad[3] = -4.0 + 2.0*x[1] + 2.0*x[3]
end
function eval_jac_g(x::Vector{Float64}, jac::Vector{Float64})
    jac[1] = 1.0
    jac[2] = 1.0
    jac[3] = 2.0
end
function eval_h(x::Vector{Float64}, lambda::Vector{Float64},
                sigma::Float64, hess::Vector{Float64})
    hess[1] = sigma*4.0
    hess[2] = sigma*2.0
    hess[3] = sigma*2.0
    hess[4] = sigma*4.0
    hess[5] = sigma*2.0
end
function eval_hv(x::Vector{Float64}, lambda::Vector{Float64},
                 sigma::Float64, hv::Vector{Float64})
    # hv holds the vector v on entry and must hold (sigma*H)*v on exit.
    # Copy v first so later rows do not use already-overwritten entries.
    # (The constraint is linear, so lambda does not contribute here.)
    v1, v2, v3 = hv[1], hv[2], hv[3]
    hv[1] = sigma*4.0*v1 + sigma*2.0*v2 + sigma*2.0*v3
    hv[2] = sigma*2.0*v1 + sigma*4.0*v2
    hv[3] = sigma*2.0*v1 + sigma*2.0*v3
end
objGoal = KTR_OBJGOAL_MINIMIZE
objType = KTR_OBJTYPE_QUADRATIC
n = 3
x_L = zeros(n)
x_U = [KTR_INFBOUND,KTR_INFBOUND,KTR_INFBOUND]
m = 1
c_Type = [KTR_CONTYPE_LINEAR]
c_L = [-KTR_INFBOUND]
c_U = [3.0]
jac_con = Int32[0,0,0]
jac_var = Int32[0,1,2]
hess_row = Int32[0,0,0,1,2]
hess_col = Int32[0,1,2,1,2]
x      = [0.5, 0.5, 0.5]
lambda = zeros(n+m)
obj    = [0.0]
kp = createProblem()
loadOptionsFile(kp, "knitro.opt")
initializeProblem(kp, objGoal, objType, x_L, x_U, c_Type, c_L, c_U,
jac_var, jac_con, hess_row, hess_col)
setCallbacks(kp, eval_f, eval_g, eval_grad_f, eval_jac_g, eval_h, eval_hv)
solveProblem(kp)
As you can see, the code mirrors the C interface fairly closely, though some C-specific features are abstracted away;
for example, the various callback-registration functions are replaced by a single setCallbacks method.
1.3 Creating and Solving Problems
The problem is solved by calling solveProblem. Applications must provide a means of evaluating the nonlinear
objective, constraints, first derivatives, and (optionally) second derivatives. (First derivatives are also optional, but
highly recommended.)
1.3.1 Typical Setup
The typical calling sequence is:
kp = createProblem()
setOption(kp, ...) (set any number of parameters)
initializeProblem(kp, ...)
setCallbacks(kp, ...)
solveProblem(kp) (a single call, or a reverse communications loop)
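For instance, with the HS35 data from the example in Section 1.2, the sequence becomes (a condensed, illustrative variant of that example, with a screen-output option set before initialization):

kp = createProblem()
setOption(kp, KTR_PARAM_OUTLEV, KTR_OUTLEV_ALL)    # set any options here, before solving
initializeProblem(kp, objGoal, objType, x_L, x_U, c_Type, c_L, c_U,
                  jac_var, jac_con, hess_row, hess_col)
setCallbacks(kp, eval_f, eval_g, eval_grad_f, eval_jac_g, eval_h, eval_hv)
solveProblem(kp)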
1.3.2 Restarting the Problem
Calling sequence if the same problem is to be solved again, with different parameters or a different start point (see
examples/hs035_restart.jl):
kp = createProblem()
setOption(kp, ...)          (set any number of parameters)
initializeProblem(kp, ...)
setCallbacks(kp, ...)
solveProblem(kp)            (a single call, or a reverse communications loop)
restartProblem(kp, ...)
setOption(kp, ...)          (set any number of parameters)
solveProblem(kp)            (a single call, or a reverse communications loop)
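Concretely, reusing the HS35 problem from Section 1.2, a restart might look like the following sketch; it assumes restartProblem accepts a new primal start point and multiplier vector, as used in examples/hs035_restart.jl:

kp = createProblem()
initializeProblem(kp, objGoal, objType, x_L, x_U, c_Type, c_L, c_U,
                  jac_var, jac_con, hess_row, hess_col)
setCallbacks(kp, eval_f, eval_g, eval_grad_f, eval_jac_g, eval_h, eval_hv)
solveProblem(kp)                              # first solve

restartProblem(kp, zeros(n), zeros(n+m))      # new primal/dual starting point (assumed signature)
setOption(kp, "algorithm", KTR_ALG_ACT_CG)    # parameters may be changed again after a restart
solveProblem(kp)                              # solve again with the new settings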
For MIP problems, use mip_init_problem and mip_solve instead (see examples/minlp.jl).
1.3.3 Reverse Communications
If the application provides callback functions for making evaluations, then a single call to solveProblem (which
wraps KTR_solve) will return the solution. Alternatively, the application can employ a reverse communications
driver, with the following calling sequence:
kp = createProblem()
setOption(kp, ...) (set any number of parameters)
initializeProblem(kp, ...)
while status != Optimal
status = solveProblem(kp, ...)
[...]
end
In this case, solveProblem returns a status code whenever it needs evaluation data (see
examples/qcqp_reversecomm.jl).
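The body of such a loop dispatches on the returned status code. The following is only a minimal sketch of that dispatch, reusing the HS35 callbacks from Section 1.2; the request-code values come from the KNITRO C header knitro.h, and the exact arguments of solveProblem in reverse-communications mode should be taken from examples/qcqp_reversecomm.jl:

# Request codes as defined in the KNITRO C header; KNITRO.jl is assumed to
# expose equivalent KTR_RC_* constants, so these local names are illustrative.
const RC_EVALFC = 1    # evaluate objective and constraint values
const RC_EVALGA = 2    # evaluate objective gradient and constraint Jacobian
const RC_EVALH  = 3    # evaluate the Hessian of the Lagrangian

# Fill in whichever evaluation data the solver asked for, reusing the HS35
# callbacks from Section 1.2, then hand control back to solveProblem.
function evaluate_request!(status, x, lambda, sigma, obj, cons, grad, jac, hess)
    if status == RC_EVALFC
        obj[1] = eval_f(x)
        eval_g(x, cons)
    elseif status == RC_EVALGA
        eval_grad_f(x, grad)
        eval_jac_g(x, jac)
    elseif status == RC_EVALH
        eval_h(x, lambda, sigma, hess)
    end
    return status
end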
1.4 Changing and reading solver parameters
Parameters cannot be set after KNITRO begins solving, i.e. after solveProblem is called. They may be set
again after restartProblem. In most cases, parameter values are not validated until initializeProblem or
solveProblem is called.
Note: The gradopt and hessopt user options must be set before calling initializeProblem and cannot be
changed afterwards.
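For example, to have KNITRO build a quasi-Newton (BFGS) approximation instead of calling eval_h, hessopt must be chosen before the problem is initialized. A sketch, reusing the HS35 data from Section 1.2 (the value 2 is KNITRO's BFGS setting):

kp = createProblem()
setOption(kp, "hessopt", int32(2))    # 2 = BFGS approximation; must be set before initializeProblem
initializeProblem(kp, objGoal, objType, x_L, x_U, c_Type, c_L, c_U,
                  jac_var, jac_con, hess_row, hess_col)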
1.4.1 Programmatic Interface
Parameters may be set using their integer identifier, e.g.
setOption(kp, KTR_PARAM_OUTLEV, KTR_OUTLEV_ALL)
setOption(kp, KTR_PARAM_MIP_OUTINTERVAL, int32(1))
setOption(kp, KTR_PARAM_MIP_MAXNODES, int32(10000))
or using their string names, e.g.
setOption(kp, "mip_method", KTR_MIP_METHOD_BB)
setOption(kp, "algorithm", KTR_ALG_ACT_CG)
setOption(kp, "outmode", KTR_OUTMODE_SCREEN)
The full list of integer identifiers is available in src/ktr_defines.jl, prefixed by KTR_PARAM_. For more
details, see the official documentation.
1.5 Callbacks
Applications may define functions for evaluating problem elements given a current solution. This section of the
documentation details the function signatures expected for the callbacks.
1.5.1 eval_f
Returns the value of the objective function at the current solution x:
function eval_f(x::Vector{Float64})    # (length n) Current Solution
    # ...
    return obj_value
end
1.5.2 eval_g
Sets the value of the constraint functions g at the current solution x:
function eval_g(x::Vector{Float64},       # (length n) Current Solution
                cons::Vector{Float64})    # (length m) Constraint values g(x)
    # ...
    # cons[1] = ...
    # ...
    # cons[prob.m] = ...
end
Note that the values of cons must be set “in-place”: a statement such as cons = zeros(prob.m) must not be used,
because it rebinds the local name rather than modifying the vector KNITRO sees. If you do want to assign a whole new
vector to cons, use cons[:], e.g. cons[:] = zeros(prob.m).
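For instance, with the single constraint from the HS35 example (a recap of code shown earlier, not a new API):

function eval_g(x::Vector{Float64}, cons::Vector{Float64})
    # Wrong: `cons = zeros(1)` would rebind the local name; KNITRO would
    # never see the values computed afterwards.

    # Right: write into the vector that was passed in.
    cons[1] = x[1] + x[2] + 2.0*x[3]
end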
1.5.3 eval_grad_f
Sets the value of the gradient of the objective function at the current solution x:
function eval_grad_f(x::Vector{Float64},      # (length n) Current Solution
                     grad::Vector{Float64})   # (length n) The gradient of the objective function
    # ...
    # grad[1] = ...
    # ...
    # grad[prob.n] = ...
end
As with eval_g, you must set the values “in-place” for eval_grad_f.
1.5.4 eval_jac_g
Sets the values of the nonzero Jacobian entries at the current solution x, following the sparsity structure passed
to KNITRO through initializeProblem. Note that Julia is 1-based, in the sense that indexing into jac always
starts at 1 (unlike C, which starts at 0):
function eval_jac_g(x::Vector{Float64},     # (length n) Current Solution
                    jac::Vector{Float64})   # (length nnzJ) The values of the Jacobian
    # ...
    # jac[1] = ...
    # ...
    # jac[nnzJ] = ... # where nnzJ = length(jac)
end
As with the previous two callbacks, all values must be set “in-place”. See the Ipopt documentation for a further description of the sparse (row, column, value) triple format.
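As a concrete instance, the HS35 example in Section 1.2 has a single constraint x1 + x2 + 2*x3 with three nonzero partial derivatives; the sparsity arrays and the callback pair up entry by entry (a recap of the earlier example, not a new API):

# Sparsity structure passed to initializeProblem (C-style, 0-based indices, as in Section 1.2):
jac_con = Int32[0, 0, 0]    # all three nonzeros belong to constraint 0
jac_var = Int32[0, 1, 2]    # ... with respect to variables x1, x2, x3

# The callback fills jac[k] with the value of the k-th declared nonzero:
function eval_jac_g(x::Vector{Float64}, jac::Vector{Float64})
    jac[1] = 1.0    # d c / d x1
    jac[2] = 1.0    # d c / d x2
    jac[3] = 2.0    # d c / d x3
end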
1.5.5 eval_h
Similar to eval_jac_g, but for the Hessian of the Lagrangian. See the KNITRO documentation for full details of
the arguments and the sparsity structure:
function eval_h(x::Vector{Float64},        # (length n) Current solution
                lambda::Vector{Float64},   # (length n+m) Multipliers for each constraint
                sigma::Float64,            # Lagrangian multiplier for objective
                hess::Vector{Float64})     # (length nnzH) The values of the Hessian
    # ...
    # hess[1] = ...
    # ...
    # hess[nnzH] = ... # where nnzH = length(hess)
end
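Continuing the HS35 example, the five declared nonzeros of the upper triangle of the Hessian, and the callback that fills them, look like this (again a recap of Section 1.2; the constraint is linear, so only the objective, scaled by sigma, contributes):

# Upper-triangular Hessian sparsity from Section 1.2 (C-style, 0-based indices):
hess_row = Int32[0, 0, 0, 1, 2]
hess_col = Int32[0, 1, 2, 1, 2]

function eval_h(x::Vector{Float64}, lambda::Vector{Float64},
                sigma::Float64, hess::Vector{Float64})
    hess[1] = sigma*4.0    # d2f/dx1^2
    hess[2] = sigma*2.0    # d2f/dx1 dx2
    hess[3] = sigma*2.0    # d2f/dx1 dx3
    hess[4] = sigma*4.0    # d2f/dx2^2
    hess[5] = sigma*2.0    # d2f/dx3^2
end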
1.5.6 eval_hv
Computes the Hessian-of-the-Lagrangian-vector product, storing the result in the vector hv:
function eval_hv(x::Vector{Float64},       # (length n) Current solution
                 lambda::Vector{Float64},  # (length n+m) Multipliers for each constraint
                 sigma::Float64,           # Lagrangian multiplier for objective
                 hv::Vector{Float64})      # (length n) Hessian-of-the-Lagrangian-vector product
    # ...
    # hv[1] = ...
    # ...
    # hv[end] = ...
end
1.6 JuMP interface
You can also work with KNITRO through JuMP, a domain-specific modeling language for mathematical programming
embedded in Julia.
Revisiting the example, here is what it looks like with JuMP:
using KNITRO, JuMP
m = Model(solver=KnitroSolver(options_file="knitro.opt"))
@defVar(m, x[1:3]>=0)
@setNLObjective(m, Min, 9.0 - 8.0*x[1] - 6.0*x[2] - 4.0*x[3]
                        + 2.0*x[1]^2 + 2.0*x[2]^2 + x[3]^2
                        + 2.0*x[1]*x[2] + 2.0*x[1]*x[3])
@addConstraint(m, x[1] + x[2] + 2.0*x[3] <= 3)
solve(m)
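After solve(m) returns, the solution can be inspected with JuMP's standard accessors (these are plain JuMP functions, not KNITRO-specific):

println("Objective value: ", getObjectiveValue(m))    # approximately 1/9 for HS35
println("x = ", [getValue(x[i]) for i in 1:3])        # approximately [4/3, 7/9, 4/9]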
1.6.1 Solver Parameters
You can also provide solver parameters to KNITRO in JuMP, e.g.
KnitroSolver() # default parameters
KnitroSolver(KTR_PARAM_ALG=5)
KnitroSolver(hessopt=1)
You can also provide the path to an options file or a tuner file using the options_file or tuner_file keywords respectively, e.g.
KnitroSolver(options_file="tuner-fixed.opt")
KnitroSolver(tuner_file="tuner-explore.opt")