
Algorithms with conic termination for nonlinear optimization. (English) Zbl 0663.65062

The authors describe the implementation of algorithms for unconstrained optimization which have the property of minimizing conic objective functions in a finite number of steps when line searches are exact. The basic properties of the conic function \(f(x)=f(x_0+s)=f_0+(1-a^Ts)^{-1}g_0^Ts+\tfrac{1}{2}(1-a^Ts)^{-2}s^TAs\) are described, where \(g_0, a\in\mathbb{R}^n\) and \(A\) is a symmetric positive definite \(n\times n\) matrix. The newly presented algorithms are designed to minimize conic functions in \(n\) steps.
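As an illustration only (not taken from the paper under review), the following minimal NumPy sketch evaluates the conic model stated above and its gradient; the helper names `conic_model` and `conic_gradient` are hypothetical, and the gradient formula is obtained by applying the chain rule to the stated definition.

```python
import numpy as np

def conic_model(s, f0, g0, a, A):
    """Evaluate the conic model f(x0+s) = f0 + (1-a^T s)^{-1} g0^T s
    + (1/2)(1-a^T s)^{-2} s^T A s, valid where 1 - a^T s > 0."""
    gamma = 1.0 / (1.0 - a @ s)
    return f0 + gamma * (g0 @ s) + 0.5 * gamma**2 * (s @ A @ s)

def conic_gradient(s, g0, a, A):
    """Gradient of the conic model with respect to s (chain rule on gamma)."""
    gamma = 1.0 / (1.0 - a @ s)
    As = A @ s
    return gamma * g0 + gamma**2 * As + gamma**2 * (g0 @ s + gamma * (s @ As)) * a

# Finite-difference check of the gradient at a random point.
rng = np.random.default_rng(0)
n = 4
B = rng.standard_normal((n, n))
A = B @ B.T + n * np.eye(n)               # symmetric positive definite
g0, a = rng.standard_normal(n), 0.1 * rng.standard_normal(n)
s = 0.1 * rng.standard_normal(n)
num = np.array([(conic_model(s + 1e-6 * e, 0.0, g0, a, A)
                 - conic_model(s - 1e-6 * e, 0.0, g0, a, A)) / 2e-6
                for e in np.eye(n)])
assert np.allclose(num, conic_gradient(s, g0, a, A), atol=1e-5)
```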
To this end the authors begin by reviewing three algorithms: 1) the conjugate gradient method, 2) the BFGS quasi-Newton method, and 3) the limited-memory BFGS method. These methods are then translated into algorithms for minimizing the conic function \(f\), and a transition from conic objective functions to general nonlinear functions is given. Finally, a general algorithm is discussed and numerical results obtained with the preferred implementation of the three new classes of algorithms are presented.
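For context, the sketch below (again not from the paper, and with the hypothetical name `bfgs_quadratic`) shows the classical BFGS method with exact line searches on a strictly convex quadratic, where it is known to terminate in at most \(n\) steps; the reviewed algorithms extend this kind of finite-termination property to conic functions.

```python
import numpy as np

def bfgs_quadratic(A, b, x0, tol=1e-12):
    """Classical BFGS with exact line searches on q(x) = 0.5 x^T A x - b^T x.

    With exact line searches and H0 = I the iterates coincide with
    conjugate-gradient iterates, so the minimizer is reached in <= n steps."""
    n = len(b)
    H = np.eye(n)                          # inverse-Hessian approximation
    x = x0.copy()
    g = A @ x - b
    for _ in range(n):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g
        alpha = -(g @ d) / (d @ A @ d)     # exact minimizer along d for a quadratic
        s = alpha * d
        x_new = x + s
        g_new = A @ x_new - b
        y = g_new - g
        rho = 1.0 / (y @ s)
        V = np.eye(n) - rho * np.outer(s, y)
        H = V @ H @ V.T + rho * np.outer(s, s)   # BFGS update of the inverse Hessian
        x, g = x_new, g_new
    return x

# On a random SPD quadratic the minimizer A^{-1} b is found in at most n iterations.
rng = np.random.default_rng(1)
n = 6
B = rng.standard_normal((n, n))
A = B @ B.T + n * np.eye(n)
b = rng.standard_normal(n)
x = bfgs_quadratic(A, b, np.zeros(n))
assert np.allclose(x, np.linalg.solve(A, b), atol=1e-8)
```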
Reviewer: H. Benker

MSC:

65K05 Numerical mathematical programming methods
91B16 Utility theory
Full Text: DOI