International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 183 - Number 20
Year of Publication: 2021
Authors: Dada Ibidapo Dare, Akinwale Adio Taofiki, Onashoga Adebukola S., Osinuga Idowu A. |
DOI: 10.5120/ijca2021921564
Dada Ibidapo Dare, Akinwale Adio Taofiki, Onashoga Adebukola S., Osinuga Idowu A. An Improved Gradient Descent Method for Optimization of Supervised Machine Learning Problems. International Journal of Computer Applications 183, 20 (Aug 2021), 39-45. DOI=10.5120/ijca2021921564
The gradient descent method is commonly used as an optimization algorithm for machine learning problems such as regression analysis and classification. It applies readily to real-life datasets, including yearly commodity demand-price data, agricultural products, and Iris flowers. This study proposed a combination of the Dai-Yuan (DY) and Saleh and Mustafa (SM) conjugate gradient methods for the optimization of supervised machine learning problems. Experiments compared the combined DY-SM method against well-known conjugate gradient methods using a fixed learning rate. The efficiency of the combined method and existing models was evaluated in terms of the number of iterations and processing time. The experimental results indicated that the combined conjugate gradient method performed better on both measures.
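To make the setting concrete, below is a minimal sketch of nonlinear conjugate gradient descent with a fixed learning rate, applied to a least-squares regression problem. Only the standard Dai-Yuan (DY) conjugate parameter is implemented; the paper's hybrid DY-SM update is not reproduced here, since the SM formula and the combination rule are defined in the paper itself. All function and variable names are illustrative, not taken from the authors' code.

```python
import numpy as np

def dy_conjugate_gradient(grad, w0, lr=0.001, tol=1e-6, max_iter=1000):
    """Minimize a function given its gradient `grad`, starting from w0.

    Uses the Dai-Yuan conjugate parameter with a fixed learning rate,
    mirroring the fixed-rate setup described in the abstract.
    """
    w = w0.copy()
    g = grad(w)
    d = -g                              # initial direction: steepest descent
    for k in range(max_iter):
        w = w + lr * d                  # fixed learning rate step
        g_new = grad(w)
        if np.linalg.norm(g_new) < tol:
            break
        y = g_new - g
        # Dai-Yuan parameter: beta_k = ||g_{k+1}||^2 / (d_k^T y_k);
        # the small constant guards against division by zero.
        beta = (g_new @ g_new) / (d @ y + 1e-12)
        d = -g_new + beta * d           # new conjugate search direction
        g = g_new
    return w, k + 1

# Example: least-squares regression, gradient of 0.5 * ||X w - y||^2.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.5, -2.0, 0.5])
grad = lambda w: X.T @ (X @ w - y)

w_opt, iters = dy_conjugate_gradient(grad, np.zeros(3))
print(f"solution {w_opt} after {iters} iterations")
```

The iteration count returned by this sketch is the kind of efficiency measure the study uses to compare methods; a hybrid scheme would replace the `beta` line with a combination of the DY and SM parameters.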