Research Article

Improving Error Back Propagation Algorithm by using Cross Entropy Error Function and Adaptive Learning Rate

by Elsadek Hussien Ibrahim, Zahraa Elsayed Mohamed
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 161 - Number 8
Year of Publication: 2017
Authors: Elsadek Hussien Ibrahim, Zahraa Elsayed Mohamed
DOI: 10.5120/ijca2017913242

Elsadek Hussien Ibrahim and Zahraa Elsayed Mohamed. Improving Error Back Propagation Algorithm by using Cross Entropy Error Function and Adaptive Learning Rate. International Journal of Computer Applications 161, 8 (Mar 2017), 5-9. DOI=10.5120/ijca2017913242

@article{ 10.5120/ijca2017913242,
author = { Elsadek Hussien Ibrahim, Zahraa Elsayed Mohamed },
title = { Improving Error Back Propagation Algorithm by using Cross Entropy Error Function and Adaptive Learning Rate },
journal = { International Journal of Computer Applications },
issue_date = { Mar 2017 },
volume = { 161 },
number = { 8 },
month = { Mar },
year = { 2017 },
issn = { 0975-8887 },
pages = { 5-9 },
numpages = {5},
url = { https://ijcaonline.org/archives/volume161/number8/27166-2017913242/ },
doi = { 10.5120/ijca2017913242 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A Elsadek Hussien Ibrahim
%A Zahraa Elsayed Mohamed
%T Improving Error Back Propagation Algorithm by using Cross Entropy Error Function and Adaptive Learning Rate
%J International Journal of Computer Applications
%@ 0975-8887
%V 161
%N 8
%P 5-9
%D 2017
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Improving the efficiency and convergence rate of multilayer backpropagation neural network algorithms is an important area of research. Recent research has paid increasing attention to entropy-based criteria in adaptive systems, and several principles have been proposed based on the maximization or minimization of an entropy function. One way to apply entropy criteria in learning systems is to minimize the entropy of the error between two variables: typically, one is the output of the learning system and the other is the target. This paper proposes a method for improving the efficiency and convergence rate of multilayer backpropagation (BP) neural networks. The usual mean square error (MSE) minimization principle is replaced by minimizing an entropy error function (EEM) of the differences between the multilayer perceptron's outputs and the desired targets. The method also improves the convergence rate of the backpropagation algorithm by adapting the learning rate: a different learning rate is determined for each epoch, depending on the weights and gradient values of the previous one. Experimental results show that the proposed method considerably improves the convergence rate of the backpropagation algorithm.
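The two ideas in the abstract can be illustrated together in a minimal sketch. This is not the authors' exact method: the network size, data, and the specific adaptive-learning-rate rule (grow the rate when the epoch error falls, shrink it otherwise) are illustrative assumptions; the paper's own per-epoch rule based on previous weights and gradients is not reproduced here. What the sketch does show is the key substitution: with sigmoid outputs, the cross-entropy error's output-layer delta reduces to (output - target), removing the derivative factor that slows MSE-based backpropagation when units saturate.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: the XOR problem (illustrative choice, not from the paper).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(scale=0.5, size=(2, 8))   # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(8, 1))   # hidden -> output weights

lr, prev_err = 0.5, np.inf
for epoch in range(2000):
    # Forward pass through one hidden layer.
    h = sigmoid(X @ W1)
    y = sigmoid(h @ W2)

    # Cross-entropy error in place of MSE.
    eps = 1e-12
    err = -np.mean(t * np.log(y + eps) + (1 - t) * np.log(1 - y + eps))

    # Backpropagation: for cross entropy with sigmoid outputs the
    # output-layer delta simplifies to (y - t).
    delta2 = y - t
    delta1 = (delta2 @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ delta2 / len(X)
    W1 -= lr * X.T @ delta1 / len(X)

    # Per-epoch adaptive learning rate (a common heuristic, assumed
    # here): reward a falling error, back off when the error rises.
    lr = min(lr * 1.05, 2.0) if err < prev_err else lr * 0.7
    prev_err = err

print(float(err))                      # final cross-entropy error
print((y > 0.5).astype(int).ravel())   # predicted XOR labels
```

Run as-is, the error falls well below its chance level (about 0.693 for balanced binary targets); swapping the delta for the MSE form `(y - t) * y * (1 - y)` makes the slowdown from saturated output units directly observable.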

References
  1. Duffner, S., and Garcia, C. (2007). An online backpropagation algorithm with validation error-based adaptive learning rate. Artificial Neural Networks–ICANN 2007, 249-258.
  2. Guangjun, S., Jialin, Z., and Zhenlong, S. (2008). The research of dynamic change learning rate strategy in BP neural network and application in network intrusion detection. Proceedings of the 3rd International Conference on Innovative Computing Information and Control, Jun. 18-20, IEEE Xplore Press, Dalian, Liaoning, pp: 513-513.
  3. Hongmei, S., and Gaofeng, Z. (2009). A new BP algorithm with adaptive momentum for FNNs training. Proceedings of the WRI Global Congress on Intelligent Systems, (IS' 09), IEEE Computer Society, pp: 16-20.
  4. Iranmanesh, S., and Mahdavi, M. A. (2009). A differential adaptive learning rate method for back-propagation neural networks. World Academy of Science, Engineering and Technology, 38, 289-292.
  5. Kathirvalavakumar, T., and Subavathi, S. J. (2012). Modified backpropagation algorithm with adaptive learning rate based on differential errors and differential functional constraints. Proceedings of the International Conference on Pattern Recognition, Informatics and Medical Engineering, Mar. 21-23, IEEE Xplore Press, Salem, Tamilnadu, pp: 61-67.
  6. Li, Y., Fu, Y., Li, H., and Zhang, S.-W. (2009). The improved training algorithm of back propagation neural network with self-adaptive learning rate. Proceedings of the International Conference on Computational Intelligence and Natural Computing, Jun. 6-7, Wuhan, China, IEEE Computer Society, pp: 73-76.
  7. Nasr, G. E., Badr, E. A., and Joun, C. (2002). Cross entropy error in neural networks: forecasting gasoline demand. In Proceedings of FLAIRS-02, 381-384: AAAI Press.
  8. Abdul Hamid, N., Mohd Nawi, N., Ghazali, R., and Mohd Salleh, M. N. (2011). Accelerating Learning Performance of Back Propagation Algorithm by Using Adaptive Gain Together with Adaptive Momentum and Adaptive Learning Rate on Classification Problems. International Journal of Software Engineering and Its Applications, 5(4), 31-44.
  9. Rady, H. (2011). Renyi's entropy and mean square error for improving the convergence of multilayer backpropagation neural networks: a comparative study. International Journal of Electrical & Computer Sciences IJECS-IJENS, 11(5).
  10. Shamsuddin, S. M., Sulaiman, M. N., and Darus, M. (2001). An improved error signal for the backpropagation model for classification problems. International Journal of Computer Mathematics, 76(3), 297-305.
  11. Subavathi, S. J., and Kathirvalavakumar, T. (2011). Adaptive modified backpropagation algorithm based on differential errors. International Journal of Computer Science, Engineering and Applications (IJCSEA), 1 (5), 21-34.
  12. Xiaoyuan, L., Bin, Q., and Lu, W. (2009). A new improved BP neural network algorithm. Paper presented at the 2009 2nd International Conference on Intelligent Computing Technology and Automation, ICICTA 2009, October 10, 2009 - October 11, 2009, Changsha, Hunan, China, 19-22.
  13. Yam, J. Y. F., and Chow, T. W. S. (2000). A weight initialization method for improving training speed in feed forward neural network. Neurocomputing, 30(1), 219-232.
  14. Yu, C.-C., and Liu, B.-D. (2002). A backpropagation algorithm with adaptive learning rate and momentum coefficient. Proceedings of the International Joint Conference on Neural Networks, May 12-17, IEEE Xplore Press, Honolulu, HI, pp: 1218-1223.
  15. Yu, X. H., and Chen, G. A. (1997). Efficient backpropagation learning using optimal learning rate and momentum. Neural Networks, 10(3), 517-527.
  16. Zhang, X. H., Ren, F. J., and Jiang, Y. C. (2012). An Improved BP Algorithm Based on Steepness Factor and Adaptive Learning Rate Adjustment Factor. Applied Mechanics and Materials, 121, 705-709.
Index Terms

Computer Science
Information Sciences

Keywords

Artificial neural network, backpropagation, mean square error, entropy error, learning rate.