Research Article

On the Optimal Learning Rate Size for the Generalization Ability of Artificial Neural Networks in Forecasting TCP/IP Traffic Trends

by Vusumuzi Moyo, Khulumani Sibanda
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 105 - Number 4
Year of Publication: 2014
Authors: Vusumuzi Moyo, Khulumani Sibanda
10.5120/18363-9506

Vusumuzi Moyo, Khulumani Sibanda. On the Optimal Learning Rate Size for the Generalization Ability of Artificial Neural Networks in Forecasting TCP/IP Traffic Trends. International Journal of Computer Applications. 105, 4 (November 2014), 9-14. DOI=10.5120/18363-9506

@article{ 10.5120/18363-9506,
author = { Vusumuzi Moyo, Khulumani Sibanda },
title = { On the Optimal Learning Rate Size for the Generalization Ability of Artificial Neural Networks in Forecasting TCP/IP Traffic Trends },
journal = { International Journal of Computer Applications },
issue_date = { November 2014 },
volume = { 105 },
number = { 4 },
month = { November },
year = { 2014 },
issn = { 0975-8887 },
pages = { 9-14 },
numpages = { 6 },
url = { https://ijcaonline.org/archives/volume105/number4/18363-9506/ },
doi = { 10.5120/18363-9506 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A Vusumuzi Moyo
%A Khulumani Sibanda
%T On the Optimal Learning Rate Size for the Generalization Ability of Artificial Neural Networks in Forecasting TCP/IP Traffic Trends
%J International Journal of Computer Applications
%@ 0975-8887
%V 105
%N 4
%P 9-14
%D 2014
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Artificial Neural Networks (ANNs) have attracted increasing attention from researchers in many fields. One area in which ANNs have featured prominently is the forecasting of TCP/IP network traffic trends. Their ability to model almost any function, regardless of its degree of nonlinearity, makes them good candidates for predicting self-similar time series such as TCP/IP traffic. In spite of this, one of the most difficult and least understood tasks in the design of ANN models is the selection of the most appropriate size of the learning rate. Although some guidance in the form of heuristics is available for the choice of this parameter, none has been universally accepted. In this paper we empirically investigate various learning rate sizes with the aim of determining the optimum size for the generalization ability of an ANN trained to forecast TCP/IP network traffic trends. MATLAB Version 7.4.0.287's Neural Network Toolbox version 5.0.2 (R2007a) was used for our experiments. We found from the simulation experiments that small learning rates generally produced consistent and better results, whereas large learning rates appeared to cause oscillations and inconsistent results. Depending on the difficulty of the problem at hand, it is advisable to set the learning rate to 0.1 for the standard Backpropagation algorithm, and to either 0.1 or 0.2 if used in conjunction with a momentum term of 0.5 or 0.6. We advise minimal use of the momentum term, as it greatly interferes with the training process of ANNs. While experimental results cannot cover all practical situations, our results do help to explain common behavior that does not agree with some theoretical expectations.
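The learning-rate and momentum behavior the abstract describes can be illustrated on a toy problem. The sketch below (a minimal Python example, not the paper's MATLAB setup; the quadratic loss and helper name are illustrative assumptions) applies the standard gradient-descent update with a momentum term, Δw(t) = −η∇E + αΔw(t−1), using the η = 0.1, α = 0.5 values the abstract recommends, and contrasts it with an overly large rate that oscillates and never converges.

```python
def momentum_update(w, grad, velocity, lr=0.1, momentum=0.5):
    """One backpropagation-style weight update with a momentum term.

    lr=0.1 and momentum=0.5 mirror the values recommended in the
    abstract; the surrounding toy loss is purely illustrative.
    """
    velocity = momentum * velocity - lr * grad
    return w + velocity, velocity

# Toy problem: minimize E(w) = w^2 (gradient 2w), starting from w = 1.0.
w, v = 1.0, 0.0
for _ in range(50):
    w, v = momentum_update(w, 2 * w, v, lr=0.1, momentum=0.5)
# A small rate with moderate momentum converges smoothly toward w = 0.

# With a large rate (lr = 1.0, no momentum) the same problem oscillates:
# each step maps w -> w - 1.0 * 2w = -w, so |w| never shrinks.
w_big = 1.0
for _ in range(50):
    w_big = w_big - 1.0 * (2 * w_big)
```

This mirrors the reported finding: the small-rate run settles to the minimum, while the large-rate run bounces between +1 and −1 indefinitely.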

References
  1. S. Chabaa, "Identification and Prediction of Internet Traffic Using Artificial Neural Networks," J. Intell. Learn. Syst. Appl., vol. 2, no. 3, pp. 147–155, 2010.
  2. R. Aamodt, "Using Artificial Neural Networks To Forecast Financial Time Series," Norwegian University of Science and Technology, 2010.
  3. H. Tong, C. Li, J. He, and Y. Chen, "Internet Traffic Prediction by W-Boost: Classification and Regression," Neural Comput., vol. 2, no. 973, pp. 397–402, 2005.
  4. Attoh-Okine, N. O., 1999. Analysis of learning rate and momentum term in backpropagation neural network algorithm trained to predict pavement performance. Advances in Engineering Software, 30(4), pp. 291–302. Available at: http://linkinghub.elsevier.com/retrieve/pii/S0965997898000714.
  5. Chen, C. J. & Miikkulainen, R., 2001. Creating Melodies with Evolving Recurrent Neural Networks. In Proceedings of the 2001 International Joint Conference on Neural Networks, 2(2), pp. 20–60.
  6. E. Richards, "Generalization in Neural Networks: Experiments in Speech Recognition," University of Colorado, 1991.
  7. Wilson, D. R. & Martinez, T. R., 2001. The need for small learning rates on large problems. In Proceedings of the International Joint Conference on Neural Networks (IJCNN'01), 1, pp. 115–119. Available at: http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=939002
  8. S. Haykin, Neural Networks: A Comprehensive Foundation, 2nd ed. Pearson, 1999, pp. 2–3.
  9. Foody, G., Lucas, R. & Curran, M., 1996. Estimation of the areal extent of land cover classes that only occur at a sub-pixel level. Canadian Journal of Remote Sensing, 22(4), pp. 428–432.
  10. Kavzoglu, T., 1999. Determining Optimum Structure for Artificial Neural Networks. In Proceedings of the 25th Annual Technical Conference and Exhibition of the Remote Sensing Society. Cardiff, UK, pp. 675–682.
  11. Swinger, K., 1996. Financial prediction. Journal of Neural Computing and Applications, 4(4), pp. 192–197.
  12. Ardö, J., Pilesjö, P. & Skidmore, A., 1997. Neural networks, multitemporal Landsat Thematic Mapper data and topographic data to classify forest damages in the Czech Republic. Canadian Journal of Remote Sensing, 23(2), pp. 217–229.
Index Terms

Computer Science
Information Sciences

Keywords

Generalization ability, Artificial Neural Networks, Learning rate size