
A new super-memory gradient method with curve search rule. (English) Zbl 1081.65058

An unconstrained minimization problem of the form \[ \text{minimize }f(x),\quad x\in\mathbb{R}^n \] is considered. It is assumed that \(f\) is a continuously differentiable function satisfying the following conditions:
(i) \(f\) has a lower bound on the level set \(L_0= \{x\in\mathbb{R}^n\mid f(x)\leq f(x_0)\}\), where \(x_0\) is given;
(ii) the gradient of \(f\) is uniformly continuous in an open convex set \(B\) that contains \(L_0\);
(iii) the gradient of \(f\) is Lipschitz continuous in \(B\).
A new super-memory gradient method with a curve search rule for solving the unconstrained minimization problem under the above assumptions is described. At each iteration, the method uses iterative information from several previous steps together with a curve search rule to generate the new iterate. The method is shown to be globally convergent with a linear convergence rate. Numerical experiments reported at the end of the paper demonstrate its computational effectiveness in practice.
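The review does not give the authors' exact curve search rule or memory combination, but the general idea of a memory gradient method with a curvilinear Armijo-type step can be sketched as follows. In this Python sketch the function name, the memory window `m`, the memory weight, and all parameter values are illustrative assumptions, not the method of the paper: the search direction combines the negative gradient with a weighted sum of previous steps, and the step length is chosen by backtracking along the curve \(x(t)=x_k+t\,d_1+t^2\,d_2\).

```python
import numpy as np

def super_memory_gradient(f, grad, x0, m=3, beta=0.4, sigma=1e-4,
                          tol=1e-6, max_iter=500):
    """Illustrative memory-gradient sketch (not the paper's exact rule).

    Direction: negative gradient d1 plus a small multiple d2 of the sum
    of the previous m steps.  Step length: Armijo-type backtracking
    along the curve x(t) = x + t*d1 + t^2*d2.
    """
    x = np.asarray(x0, dtype=float)
    history = []                      # previous accepted steps
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # gradient-norm stopping test
            break
        d1 = -g
        d2 = 0.1 * sum(history[-m:], np.zeros_like(x))  # memory term
        t, fx = 1.0, f(x)
        # backtrack until the Armijo sufficient-decrease condition holds
        while f(x + t * d1 + t * t * d2) > fx + sigma * t * g.dot(d1):
            t *= beta
            if t < 1e-12:
                break
        step = t * d1 + t * t * d2
        history.append(step)
        x = x + step
    return x
```

On a simple quadratic such as \(f(x)=\|x\|^2\) the sketch converges to the minimizer; the quadratic term \(t^2 d_2\) makes the influence of the memory directions vanish faster than the gradient term as the step length is reduced, which is the typical motivation for curve (rather than line) searches.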

MSC:

65K05 Numerical mathematical programming methods
90C30 Nonlinear programming
