Research Article

On the Numerical Performance of a New Conjugate Gradient Parameter for Solving Unconstrained Optimization Problems

by Aliyu Usman Moyi, Onwuka Blessing
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 177 - Number 14
Year of Publication: 2019
Authors: Aliyu Usman Moyi, Onwuka Blessing
DOI: 10.5120/ijca2019919538

Aliyu Usman Moyi, Onwuka Blessing. On the Numerical Performance of a New Conjugate Gradient Parameter for Solving Unconstrained Optimization Problems. International Journal of Computer Applications 177, 14 (Oct 2019), 1-3. DOI=10.5120/ijca2019919538

@article{ 10.5120/ijca2019919538,
author = { Aliyu Usman Moyi and Onwuka Blessing },
title = { On the Numerical Performance of a New Conjugate Gradient Parameter for Solving Unconstrained Optimization Problems },
journal = { International Journal of Computer Applications },
issue_date = { Oct 2019 },
volume = { 177 },
number = { 14 },
month = { Oct },
year = { 2019 },
issn = { 0975-8887 },
pages = { 1-3 },
numpages = { 3 },
url = { https://ijcaonline.org/archives/volume177/number14/30962-2019919538/ },
doi = { 10.5120/ijca2019919538 },
publisher = { Foundation of Computer Science (FCS), NY, USA },
address = { New York, USA }
}
%0 Journal Article
%A Aliyu Usman Moyi
%A Onwuka Blessing
%T On the Numerical Performance of a New Conjugate Gradient Parameter for Solving Unconstrained Optimization Problems
%J International Journal of Computer Applications
%@ 0975-8887
%V 177
%N 14
%P 1-3
%D 2019
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Nonlinear conjugate gradient (CG) methods are widely used for solving unconstrained optimization problems. Their wide application in many fields, such as engineering, the applied sciences, and economics, is due to their low memory requirements and global convergence properties. Numerous studies and modifications have been directed towards improving the efficiency of these methods. In this paper, a new conjugate gradient parameter βk that possesses convergence properties is presented. Preliminary numerical results are also presented to show the efficiency of the proposed method.
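
The formula for the new parameter βk is not reproduced on this abstract page, so the following is only a minimal sketch, in Python, of the general class of methods the paper studies: a nonlinear conjugate gradient loop with an Armijo backtracking line search. The function name nonlinear_cg, its arguments, and the use of the Fletcher-Reeves coefficient as a stand-in for βk are illustrative assumptions, not the authors' method.

import numpy as np

def nonlinear_cg(f, grad, x0, beta_rule=None, tol=1e-6, max_iter=500):
    # Generic nonlinear conjugate gradient loop (sketch, not the paper's method).
    # beta_rule(g_new, g_old, d_old) returns the CG coefficient beta_k;
    # the Fletcher-Reeves formula is used here only as a stand-in.
    if beta_rule is None:
        beta_rule = lambda g_new, g_old, d_old: (g_new @ g_new) / (g_old @ g_old)
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha, c, rho = 1.0, 1e-4, 0.5        # Armijo backtracking line search
        for _ in range(60):
            if f(x + alpha * d) <= f(x) + c * alpha * (g @ d):
                break
            alpha *= rho
        x = x + alpha * d
        g_new = grad(x)
        beta = beta_rule(g_new, g, d)         # conjugate gradient parameter beta_k
        d = -g_new + beta * d                 # new search direction
        if g_new @ d >= 0:                    # restart if not a descent direction
            d = -g_new
        g = g_new
    return x

# Example: minimize the Rosenbrock function from a standard starting point.
f = lambda x: (1.0 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0]**2),
                           200.0 * (x[1] - x[0]**2)])
print(nonlinear_cg(f, grad, [-1.2, 1.0]))     # should move towards the minimizer (1, 1)

Different choices of beta_rule (Fletcher-Reeves, Polak-Ribière, Hestenes-Stiefel, or a new parameter such as the one proposed in the paper) plug into the same loop, which is why the CG coefficient is the main object of study in work of this kind.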

References
  1. Andrei, N.: An unconstrained optimization test functions collection. Adv. Model. Optim. 10(1), 147–161 (2008)
  2. Goldstein, A.A.: On steepest descent. SIAM J. Control 3, 147–151 (1965)
  3. Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées. Rev. Française Inform. Recherche Opérationnelle 3, 36–43 (1969)
  4. Byrd, R.H., Nocedal, J.: A tool for the analysis of quasi-Newton methods with application to unconstrained minimization. SIAM J. Numer. Anal. 26(3), 727–739 (1989)
  5. Ibrahim, M.A., Mamat, M., Leong, W.J.: BFGS method: a new search direction. Sains Malaysiana 42(10), 1593–1599 (2014)
  6. Dai, Y.H.: On the nonmonotone line search. J. Optim. Theory Appl. 112(2), 315–330 (2002)
  7. Armijo, L.: Minimization of functions having Lipschitz continuous partial derivatives. Pacific J. Math. 16, 1–3 (1966)
  8. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2), 201–213 (2002)
  9. Gilbert, J.C., Nocedal, J.: Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2(1), 21–42 (1992)
  10. Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Nat. Bur. Stand. 49, 409–436 (1952)
  11. Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput. J. 7, 149–154 (1964)
  12. Dai, Y.H., Yuan, Y.: Nonlinear Conjugate Gradient Method. Shanghai Scientific and Technical Publishers, Beijing (1998)
  13. Liu, Y., Storey, C.: Efficient generalized conjugate gradient algorithms, part 1: theory. J. Optim. Theory Appl. 69, 129–137 (1991)
  14. Panier, E.R., Tits, A.L.: Avoiding the Maratos effect by means of a nonmonotone line search I. General constrained problems. SIAM J. Numer. Anal. 28(4), 1183–1195 (1991)
  15. Perry, J.M.: A class of conjugate gradient algorithms with a two-step variable metric memory. Center for Mathematical Studies in Economics and Management Science, Northwestern University, Evanston, Illinois (1977)
  16. Sun, W., Han, J., Sun, J.: Global convergence of nonmonotone descent methods for unconstrained optimization problems. J. Comput. Appl. Math. 146(1), 89–98 (2002)
  17. Shi, Z.J., Wang, S., Xu, Z.: The convergence of conjugate gradient method with nonmonotone line search. Appl. Math. Comput. 217(5), 1921–1932 (2010)
  18. Yuan, G., Wei, Z.: Non-monotone backtracking inexact BFGS method for regression analysis. Comm. Stat. Theory Meth. 42(2), 214–238 (2013)
Index Terms

Computer Science
Information Sciences

Keywords

Unconstrained Optimization, Conjugate Gradient Method, Conjugate Gradient Coefficient, Global Convergence.