Research Article

An Improved Gradient Descent Method for Optimization of Supervised Machine Learning Problems

by Dada Ibidapo Dare, Akinwale Adio Taofiki, Onashoga Adebukola S., Osinuga Idowu A.
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 183 - Number 20
Year of Publication: 2021
Authors: Dada Ibidapo Dare, Akinwale Adio Taofiki, Onashoga Adebukola S., Osinuga Idowu A.
10.5120/ijca2021921564

Dada Ibidapo Dare, Akinwale Adio Taofiki, Onashoga Adebukola S., Osinuga Idowu A. An Improved Gradient Descent Method for Optimization of Supervised Machine Learning Problems. International Journal of Computer Applications 183, 20 (Aug 2021), 39-45. DOI=10.5120/ijca2021921564

@article{ 10.5120/ijca2021921564,
author = { Dada Ibidapo Dare, Akinwale Adio Taofiki, Onashoga Adebukola S., Osinuga Idowu A. },
title = { An Improved Gradient Descent Method for Optimization of Supervised Machine Learning Problems },
journal = { International Journal of Computer Applications },
issue_date = { Aug 2021 },
volume = { 183 },
number = { 20 },
month = { Aug },
year = { 2021 },
issn = { 0975-8887 },
pages = { 39-45 },
numpages = {7},
url = { https://ijcaonline.org/archives/volume183/number20/32043-2021921564/ },
doi = { 10.5120/ijca2021921564 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
Abstract

The gradient descent method is commonly used as an optimization algorithm for machine learning problems such as regression analysis and classification. It is applicable to real-life datasets such as yearly commodity demand-price data, agricultural products, and Iris flowers. This study proposes a combination of the Dai-Yuan (DY) and Saleh and Mustafa (SM) conjugate gradient methods for the optimization of supervised machine learning problems. Experiments compared the combined DY-SM method against well-known conjugate gradient methods using a fixed learning rate. The efficiency of the combined method and of existing models was evaluated in terms of the number of iterations and processing time. The experimental results indicated that the combined conjugate gradient method performed better on both measures.
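To illustrate the family of methods the abstract refers to, the following is a minimal sketch of a nonlinear conjugate gradient loop with a fixed learning rate. It uses the standard Dai-Yuan (DY) coefficient as the example; the paper's combined DY/SM coefficient is not reproduced here, and the function `conjugate_gradient` and the test problem are illustrative, not from the paper.

```python
import numpy as np

def conjugate_gradient(grad, x0, lr=0.01, tol=1e-6, max_iter=1000):
    """Nonlinear conjugate gradient with a fixed learning rate.

    Illustrative sketch: uses the Dai-Yuan coefficient
    beta = ||g_{k+1}||^2 / (d_k^T (g_{k+1} - g_k)).
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                       # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x = x + lr * d           # fixed learning rate, as in the experiments
        g_new = grad(x)
        denom = d @ (g_new - g)  # Dai-Yuan denominator
        beta = (g_new @ g_new) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d    # update the search direction
        g = g_new
    return x

# Usage: minimize f(x) = (x0 - 3)^2 + (x1 + 1)^2, whose minimizer is (3, -1)
grad_f = lambda x: np.array([2 * (x[0] - 3), 2 * (x[1] + 1)])
x_star = conjugate_gradient(grad_f, [0.0, 0.0], lr=0.1)  # → approximately [3., -1.]
```

Conjugate gradient methods differ only in the choice of the coefficient beta; swapping in another formula (or a hybrid of two, as the paper proposes) changes a single line of the loop.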

References
  1. Aliyu, U. M., Wah, J. L. and Ibrahim, S., "On Application of Three-Term Conjugate Gradient Method in Regression Analysis", International Journal of Computer Applications, 102(8), 2014.
  2. Dai, Y. H. and Yuan, Y., "An efficient hybrid conjugate gradient method for unconstrained optimization", Annals of Operations Research, 103, 33-47, 2001.
  3. Fletcher, R. and Reeves, C. M., "Function minimization by conjugate gradients", Computer Journal, 7, 149-154, 1964.
  4. Robbins, H. and Monro, S., "A stochastic approximation method", The Annals of Mathematical Statistics, pp. 400-407, 1951.
  5. Hamoda, M., Rivaie, M., Mamat, M. and Salleh, Z., "A new nonlinear conjugate gradient coefficient for unconstrained optimization", Applied Mathematical Sciences, 9(37), 1813-1822, 2015.
  6. Duchi, J., Hazan, E. and Singer, Y., "Adaptive subgradient methods for online learning and stochastic optimization", Journal of Machine Learning Research, 12, 2121-2159, 2011.
  7. Kamfa, K. U., Mamat, M., Abajhar, A., Rivaie, M., Ghazali, P. B. and Salleh, Z., "Another Modified Conjugate Gradient Coefficient with Global Convergence Properties", Applied Mathematical Sciences, 9(37), 1833-1844, 2015.
  8. Liu, Y. and Storey, C., "Efficient generalized conjugate gradient algorithms, part 1: theory", Journal of Optimization Theory and Applications, 69(1), 129-137, 1991.
  9. Rivaie, M., Mamat, M., June, L. W. and Mohd, I., "A new class of nonlinear conjugate gradient coefficients with global convergence properties", Applied Mathematics and Computation, 218(22), 11323-11332, 2012.
  10. Sulaiman, I. M., "Solving Fuzzy Nonlinear Equations with a New Class of Conjugate Gradient Method", Malaysian Journal of Computing and Applied Mathematics, 1(1), 11-19, 2018.
  11. Sun, Y., Zhang, Z., Yang, Z. and Dan, L., "Application of Logistic Regression with Fixed Memory Step Gradient Descent Method in Multi-Class Classification Problem", Proceedings of the 6th International Conference on Systems and Informatics (ICSAI 2019), 2019.
  12. Dai, Y.-H., "Nonlinear Conjugate Gradient Methods", Wiley Encyclopedia of Operations Research and Management Science, 2010.
Index Terms

Computer Science
Information Sciences

Keywords

Conjugate gradient method, machine learning, regression analysis, data classification.