
A new hybrid three-term LS-CD conjugate gradient in solving unconstrained optimization problems. (English) Zbl 07858291

Summary: The Conjugate Gradient (CG) method is renowned for its rapid convergence in optimization applications. Over the years, several modifications of CG methods have emerged to improve computational efficiency and tackle practical challenges. This paper presents a new three-term hybrid CG method for solving unconstrained optimization problems. The algorithm uses a search direction that combines the Liu-Storey (LS) and Conjugate Descent (CD) CG coefficients and standardizes it with a spectral parameter that acts as a scheme for choosing the conjugate parameters. The resulting direction closely approximates the memoryless Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton direction; it is bounded and satisfies the sufficient descent condition. The paper establishes global convergence under the standard Wolfe conditions and appropriate assumptions. Additionally, numerical experiments were conducted to demonstrate the robustness and superior efficiency of the hybrid algorithm compared with existing approaches.
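
To make the general structure of such a three-term hybrid LS-CD iteration concrete, the following Python sketch combines the classical Liu-Storey [30] and Conjugate Descent [21] coefficients in a convex combination and appends a third correction term to the search direction. It is only a minimal illustration of the generic scheme, not the authors' method: the hybridization weight `lam`, the third-term weight `theta`, the restart step, and the use of SciPy's strong-Wolfe line search are assumptions introduced for this example.

```python
import numpy as np
from scipy.optimize import line_search


def hybrid_ls_cd_cg(f, grad, x0, lam=0.5, tol=1e-6, max_iter=500):
    """Generic three-term hybrid LS-CD conjugate gradient iteration (sketch).

    NOT the paper's exact algorithm: the hybridization weight ``lam``, the
    third-term weight ``theta``, and the strong-Wolfe line search from SciPy
    are illustrative assumptions.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Step length satisfying the (strong) Wolfe conditions.
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:                    # line search failed: restart
            d = -g
            alpha = 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = -(d @ g)                     # common LS/CD denominator -d_k^T g_k
        beta_ls = (g_new @ y) / denom        # Liu-Storey coefficient
        beta_cd = (g_new @ g_new) / denom    # Conjugate Descent coefficient
        beta = lam * beta_ls + (1 - lam) * beta_cd   # hybrid choice (placeholder)
        theta = (g_new @ d) / denom          # third-term weight (placeholder)
        d = -g_new + beta * d - theta * y    # three-term search direction
        x, g = x_new, g_new
    return x


# Example: minimize the Rosenbrock function provided by SciPy.
if __name__ == "__main__":
    from scipy.optimize import rosen, rosen_der
    print(hybrid_ls_cd_cg(rosen, rosen_der, np.zeros(4)))
```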

MSC:

90C30 Nonlinear programming

Software:

L-BFGS; minpack
Full Text: DOI

References:

[1] A. B. Abubakar, P. Kumam, M. Malik, P. Chaipunya & A. H. Ibrahim (2021). A hybrid FR-DY conjugate gradient algorithm for unconstrained optimization with application in portfolio selection. AIMS Mathematics, 6(6), 6506-6527. https://doi.org/10.3934/math.2021383. · Zbl 1484.90138 · doi:10.3934/math.2021383
[2] A. B. Abubakar, P. Kumam, M. Malik & A. H. Ibrahim (2022). A hybrid conjugate gradient based approach for solving unconstrained optimization and motion control problems. Mathematics and Computers in Simulation, 201, 640-657. https://doi.org/10.1016/j.matcom.2021.05.038. · Zbl 1540.90255 · doi:10.1016/j.matcom.2021.05.038
[3] N. Andrei (2008). Another hybrid conjugate gradient algorithm for unconstrained optimization. Numerical Algorithms, 47, 143-156. https://doi.org/10.1007/s11075-007-9152-9. · Zbl 1141.65041 · doi:10.1007/s11075-007-9152-9
[4] N. Andrei (2008). A hybrid conjugate gradient algorithm for unconstrained optimization as convex combination of Hestenes-Stiefel and Dai-Yuan. Studies in Informatics and Control, 17(1), 57-70.
[5] N. Andrei (2008). An unconstrained optimization test functions collection. Advanced Modeling and Optimization, 10(1), 147-161. https://citeseerx.ist.psu.edu/document?repid=rep1&type=pdf&doi=0aa7264b4c5b14ddf091bfdc328c4fcb4049d4f4. · Zbl 1161.90486
[6] N. Andrei (2009). Hybrid conjugate gradient algorithm for unconstrained optimization. Journal of Optimization Theory and Applications, 141(2), 249-264. https://doi.org/10.1007/s10957-008-9505-0. · Zbl 1168.90017 · doi:10.1007/s10957-008-9505-0
[7] N. Andrei (2013). A simple three-term conjugate gradient algorithm for unconstrained optimization. Journal of Computational and Applied Mathematics, 241, 19-29. https://doi.org/10.1016/j.cam.2012.10.002. · Zbl 1258.65056 · doi:10.1016/j.cam.2012.10.002
[8] N. Andrei (2020). Nonlinear Conjugate Gradient Methods for Unconstrained Optimization. Springer, Cham, 1st edition. https://doi.org/10.1007/978-3-030-42950-8. · Zbl 1514.90250 · doi:10.1007/978-3-030-42950-8
[9] S. Babaie-Kafaki (2013). A hybrid conjugate gradient method based on quadratic relaxation of Dai-Yuan hybrid conjugate gradient parameter. Optimization, 62(7), 929-941. https://doi.org/10.1080/02331934.2011.611512. · Zbl 1278.65085 · doi:10.1080/02331934.2011.611512
[10] S. Babaie-Kafaki, M. Fatemi & N. Mahdavi-Amiri (2011). Two effective hybrid conjugate gradient algorithms based on modified BFGS updates. Numerical Algorithms, 58(3), 315-331. https://doi.org/10.1007/s11075-011-9457-6. · Zbl 1277.90149 · doi:10.1007/s11075-011-9457-6
[11] S. Babaie-Kafaki, R. Ghanbari & N. Mahdavi-Amiri (2010). Two new conjugate gradient methods based on modified secant equations. Journal of Computational and Applied Mathematics, 234(5), 1374-1386. https://doi.org/10.1016/j.cam.2010.01.052. · Zbl 1202.65071 · doi:10.1016/j.cam.2010.01.052
[12] S. Babaie-Kafaki & N. Mahdavi-Amiri (2013). Two modified hybrid conjugate gradient methods based on a hybrid secant equation. Mathematical Modelling and Analysis, 18(1), 32-52. https://doi.org/10.3846/13926292.2013.756832. · Zbl 1264.49034 · doi:10.3846/13926292.2013.756832
[13] Y. H. Dai (2001). New properties of a nonlinear conjugate gradient method. Numerische Mathematik, 89, 83-98. https://doi.org/10.1007/PL00005464. · Zbl 1006.65063 · doi:10.1007/PL00005464
[14] Y. H. Dai & Y. Yuan (1999). A nonlinear conjugate gradient method with a strong global convergence property. SIAM Journal on Optimization, 10(1), 177-182. https://doi.org/10.1137/S1052623497318992. · Zbl 0957.65061 · doi:10.1137/S1052623497318992
[15] J. Deepho, A. B. Abubakar, M. Malik & I. K. Argyros (2022). Solving unconstrained optimization problems via hybrid CD-DY conjugate gradient methods with applications. Journal of Computational and Applied Mathematics, 405, Article ID: 113823. https://doi.org/10.1016/j.cam.2021.113823. · Zbl 1480.65145 · doi:10.1016/j.cam.2021.113823
[16] J. E. Dennis Jr & J. J. Moré (1977). Quasi-Newton methods, motivation and theory. SIAM Review, 19(1), 46-89. https://www.jstor.org/stable/2029325. · Zbl 0356.65041
[17] S. S. Djordjević (2019). New hybrid conjugate gradient method as a convex combination of LS and FR methods. Acta Mathematica Scientia, 39, 214-228. https://doi.org/10.1007/s10473-019-0117-6. · Zbl 1499.90217 · doi:10.1007/s10473-019-0117-6
[18] E. D. Dolan & J. J. Moré (2002). Benchmarking optimization software with performance profiles. Mathematical Programming, 91(2), 201-213. https://doi.org/10.1007/s101070100263. · Zbl 1049.90004 · doi:10.1007/s101070100263
[19] X. L. Dong, D. R. Han, R. Ghanbari, X. L. Li & Z. F. Dai (2017). Some new three-term Hestenes-Stiefel conjugate gradient methods with affine combination. Optimization, 66(5), 759-776. https://doi.org/10.1080/02331934.2017.1295242. · Zbl 1375.90281 · doi:10.1080/02331934.2017.1295242
[20] R. Fletcher & C. M. Reeves (1964). Function minimization by conjugate gradients. The Computer Journal, 7(2), 149-154. https://doi.org/10.1093/comjnl/7.2.149. · Zbl 0132.11701 · doi:10.1093/comjnl/7.2.149
[21] R. Fletcher (2013). Practical Methods of Optimization. John Wiley & Sons, New York, 2nd edition. https://doi.org/10.1002/9781118723203. · doi:10.1002/9781118723203
[22] W. W. Hager & H. Zhang (2006). A survey of nonlinear conjugate gradient methods. Pacific Journal of Optimization, 2(1), 35-58. https://people.clas.ufl.edu/hager/files/cg_survey.pdf. · Zbl 1117.90048
[23] M. R. Hestenes & E. Stiefel (1952). Methods of conjugate gradients for solving linear systems. Journal of Research of the National Bureau of Standards, 49(6), 409-436. https://doi.org/10.6028/JRES.049.044. · Zbl 0048.09901 · doi:10.6028/JRES.049.044
[24] M. Jamil & X. S. Yang (2013). A literature survey of benchmark functions for global optimization problems. International Journal of Mathematical Modelling and Numerical Optimisation, 4(2), 150-194. https://doi.org/10.1504/IJMMNO.2013.055204. · Zbl 1280.65053 · doi:10.1504/IJMMNO.2013.055204
[25] P. Kumam, A. B. Abubakar, M. Malik, A. H. Ibrahim, N. Pakkaranang & B. Panyanak (2023). A hybrid HS-LS conjugate gradient algorithm for unconstrained optimization with applications in motion control and image recovery. Journal of Computational and Applied Mathematics, 433, Article ID: 115304. https://doi.org/10.1016/j.cam.2023.115304. · Zbl 1517.65049 · doi:10.1016/j.cam.2023.115304
[26] M. Li (2018). A modified Hestenes-Stiefel conjugate gradient method close to the memoryless BFGS quasi-Newton method. Optimization Methods and Software, 33(2), 336-353. https://doi.org/10.1080/10556788.2017.1325885. · Zbl 1397.90361 · doi:10.1080/10556788.2017.1325885
[27] M. Li (2020). A three term Polak-Ribière-Polyak conjugate gradient method close to the memoryless BFGS quasi-Newton method. Journal of Industrial and Management Optimization, 16(1), 245-260. https://doi.org/10.3934/jimo.2018149. · Zbl 1438.90326 · doi:10.3934/jimo.2018149
[28] J. K. Liu, Y. M. Feng & L. M. Zou (2018). Some three-term conjugate gradient methods with the inexact line search condition. Calcolo, 55(2), Article ID: 16. https://doi.org/10.1007/s10092-018-0258-3. · Zbl 1393.90116 · doi:10.1007/s10092-018-0258-3
[29] J. K. Liu & S. J. Li (2014). New hybrid conjugate gradient method for unconstrained optimization. Applied Mathematics and Computation, 245, 36-43. https://doi.org/10.1016/j.amc.2014.07.096. · Zbl 1335.90114 · doi:10.1016/j.amc.2014.07.096
[30] Y. Liu & C. Storey (1991). Efficient generalized conjugate gradient algorithms, part 1: Theory. Journal of Optimization Theory and Applications, 69, 129-137. https://doi.org/10.1007/BF00940464. · Zbl 0702.90077 · doi:10.1007/BF00940464
[31] M. Malik, I. M. Sulaiman, A. B. Abubakar, G. Ardaneswari & Sukono (2023). A new family of hybrid three-term conjugate gradient method for unconstrained optimization with application to image restoration and portfolio selection. AIMS Mathematics, 8(1), 1-28. https://doi.org/10.3934/math.2023001. · doi:10.3934/math.2023001
[32] J. J. Moré, B. S. Garbow & K. E. Hillstrom (1981). Testing unconstrained optimization software. ACM Transactions on Mathematical Software, 7(1), 17-41. https://doi.org/10.1145/355934.355936. · Zbl 0454.65049 · doi:10.1145/355934.355936
[33] J. Nocedal (1980). Updating quasi-Newton matrices with limited storage. Mathematics of Computation, 35(151), 773-782. https://doi.org/10.1090/S0025-5718-1980-0572855-7. · Zbl 0464.65037 · doi:10.1090/S0025-5718-1980-0572855-7
[34] E. Polak & G. Ribiere (1969). Note sur la convergence de méthodes de directions conjuguées. Revue Française d'Informatique et de Recherche Opérationnelle. Série Rouge, 3(R1), 35-43. http://www.numdam.org/item/M2AN_1969__3_1_35_0/. · Zbl 0174.48001
[35] B. T. Polyak (1969). The conjugate gradient method in extremal problems. USSR Computational Mathematics and Mathematical Physics, 9(4), 94-112. https://doi.org/10.1016/0041-5553(69)90035-4. · Zbl 0229.49023 · doi:10.1016/0041-5553(69)90035-4
[36] M. J. D. Powell (1984). Nonconvex minimization calculations and the conjugate gradient method. In D. F. Griffiths (Ed.), Numerical Analysis, pp. 122-141. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0099521. · Zbl 0531.65035 · doi:10.1007/BFb0099521
[37] N. Salihu, M. R. Odekunle, M. Y. Waziri & A. S. Halilu (2020). A new hybrid conjugate gradient method based on secant equation for solving large scale unconstrained optimization problems. Iranian Journal of Optimization, 12(1), 33-44. https://dorl.net/dor/20.1001.1.25885723.2020.12.1.4.0.
[38] D. F. Shanno (1978). Conjugate gradient methods with inexact searches. Mathematics of Operations Research, 3(3), 244-256. https://www.jstor.org/stable/3689494. · Zbl 0399.90077
[39] L. Wang, M. Cao, F. Xing & Y. Yang (2020). The new spectral conjugate gradient method for large-scale unconstrained optimisation. Journal of Inequalities and Applications, 2020(1), Article ID: 111. https://doi.org/10.1186/s13660-020-02375-z. · Zbl 1503.90136 · doi:10.1186/s13660-020-02375-z
[40] X. Xu & F. Y. Kong (2016). New hybrid conjugate gradient methods with the generalized Wolfe line search. SpringerPlus, 5(1), Article ID: 881. https://doi.org/10.1186/s40064-016-2522-9. · doi:10.1186/s40064-016-2522-9
[41] X. Yang, Z. Luo & X. Dai (2013). A global convergence of LS-CD hybrid conjugate gradient method. Advances in Numerical Analysis, 2013, Article ID: 517452. https://doi.org/10.1155/2013/517452. · Zbl 1292.65066 · doi:10.1155/2013/517452
[42] G. Zoutendijk (1970). Nonlinear programming computational methods. In J. Abadie (Ed.), Integer and Nonlinear Programming, pp. 37-86. North-Holland, Amsterdam. · Zbl 0336.90057
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. In some cases that data have been complemented/enhanced by data from zbMATH Open. This attempts to reflect the references listed in the original paper as accurately as possible without claiming completeness or a perfect matching.