Research Article

A Homogeneous Ensemble of Artificial Neural Networks for Time Series Forecasting

by Ratnadip Adhikari, R. K. Agrawal
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 32 - Number 7
Year of Publication: 2011
Authors: Ratnadip Adhikari, R. K. Agrawal
DOI: 10.5120/3913-5505

Ratnadip Adhikari, R. K. Agrawal. A Homogeneous Ensemble of Artificial Neural Networks for Time Series Forecasting. International Journal of Computer Applications 32, 7 (October 2011), 1-8. DOI=10.5120/3913-5505

@article{ 10.5120/3913-5505,
author = { Ratnadip Adhikari, R. K. Agrawal },
title = { A Homogeneous Ensemble of Artificial Neural Networks for Time Series Forecasting },
journal = { International Journal of Computer Applications },
issue_date = { October 2011 },
volume = { 32 },
number = { 7 },
month = { October },
year = { 2011 },
issn = { 0975-8887 },
pages = { 1-8 },
numpages = { 8 },
url = { https://ijcaonline.org/archives/volume32/number7/3913-5505/ },
doi = { 10.5120/3913-5505 },
publisher = { Foundation of Computer Science (FCS), NY, USA },
address = { New York, USA }
}
%0 Journal Article
%A Ratnadip Adhikari
%A R. K. Agrawal
%T A Homogeneous Ensemble of Artificial Neural Networks for Time Series Forecasting
%J International Journal of Computer Applications
%@ 0975-8887
%V 32
%N 7
%P 1-8
%D 2011
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Enhancing the robustness and accuracy of time series forecasting models is an active area of research. Recently, Artificial Neural Networks (ANNs) have found extensive application in many practical forecasting problems. However, the standard backpropagation training algorithm has several critical shortcomings: a slow convergence rate, a tendency to become trapped in local minima of complex error surfaces, and the lack of systematic methods for selecting training parameters. Various improved training methods have been developed in the literature to overcome these drawbacks, but none of them can be guaranteed to be the best for all problems. In this paper, we propose a novel weighted ensemble scheme that intelligently combines multiple training algorithms to increase ANN forecast accuracy. The weight for each training algorithm is determined from the performance of the corresponding ANN model on the validation dataset. Experimental results on four important time series show that the proposed technique substantially mitigates the shortcomings of the individual ANN training algorithms and achieves significantly better forecast accuracy than two other popular statistical models.
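The abstract conveys the core idea of the scheme (one feedforward network per training algorithm, combined with weights derived from validation-set performance) but not the exact weighting formula. The Python sketch below is only a minimal illustration of that idea under stated assumptions: it uses scikit-learn's MLPRegressor with different solvers as a stand-in for the MATLAB training algorithms cited in the paper (Levenberg-Marquardt, RPROP, SCG, OSS, etc.), and it assumes each member's weight is the normalised inverse of its validation mean squared error. The function names fit_weighted_ensemble and ensemble_forecast are hypothetical, not from the paper.

import numpy as np
from sklearn.neural_network import MLPRegressor

def fit_weighted_ensemble(X_train, y_train, X_val, y_val,
                          solvers=("lbfgs", "adam", "sgd")):
    """Train one network per training algorithm and weight it by its
    validation-set accuracy (smaller error -> larger weight)."""
    models, weights = [], []
    for solver in solvers:
        # Each solver stands in for one of the paper's ANN training algorithms.
        net = MLPRegressor(hidden_layer_sizes=(8,), solver=solver,
                           max_iter=2000, random_state=0)
        net.fit(X_train, y_train)
        val_mse = np.mean((net.predict(X_val) - y_val) ** 2)
        models.append(net)
        weights.append(1.0 / (val_mse + 1e-12))   # inverse-error weight (assumption)
    weights = np.asarray(weights)
    return models, weights / weights.sum()        # normalise weights to sum to 1

def ensemble_forecast(models, weights, X_test):
    """Combine the individual forecasts as a weighted sum."""
    preds = np.column_stack([net.predict(X_test) for net in models])
    return preds @ weights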

References
  1. G.E.P. Box, G.M. Jenkins, Time Series Analysis: Forecasting and Control, 3rd ed. Holden-Day, California, 1970.
  2. G.P. Zhang, “Time series forecasting using a hybrid ARIMA and neural network model,” Neurocomputing 50, pp. 159–175, 2003.
  3. G.P. Zhang, “A neural network ensemble method with jittered training data for time series forecasting,” Information Sciences 177, pp. 5329–5346, 2007.
  4. G. Zhang, B.E. Patuwo, M.Y. Hu, “Forecasting with artificial neural networks: The state of the art,” International Journal of Forecasting 14, pp.35–62, 1998.
  5. J. Kamruzzaman, R. Begg, R. Sarker, Artificial Neural Networks in Finance and Manufacturing, Idea Group Publishing, 2006.
  6. M. Adya, F. Collopy, “How effective are neural networks at forecasting and prediction? A review and evaluation,” Journal of Forecasting 17, pp. 481–495, 1998.
  7. D.E. Rumelhart, G.E. Hinton, R. J. Williams, “Learning representations by back-propagating errors,” Nature 323 (6188), pp. 533-536, 1986.
  8. M. Hagan, M. Menhaj, "Training feedforward networks with the Marquardt algorithm," IEEE Transactions on Neural Networks, vol. 5, no. 6, pp. 989–993, November 1994.
  9. M. Riedmiller, H. Braun, "A direct adaptive method for faster backpropagation learning: The RPROP algorithm," In Proceedings of the IEEE Int. Conference on Neural Networks (ICNN), San Francisco, pp. 586–591, 1993.
  10. M.F. Moller, "A scaled conjugate gradient algorithm for fast supervised learning," Neural Networks 6, pp. 525–533, 1993.
  11. R. Battiti, "One step secant conjugate gradient," Neural Computation 4, pp. 141–166, 1992.
  12. J.E. Dennis, R.B. Schnabel, Numerical Methods for Unconstrained Optimization and Nonlinear Equations, Englewood Cliffs, NJ: Prentice-Hall, 1983.
  13. J. Kennedy, R.C. Eberhart, Y. Shi, Swarm Intelligence, Morgan Kaufmann, San Francisco, CA, 2001.
  14. G.K. Jha, P. Thulasiraman, R.K. Thulasiram, “PSO based neural network for time series forecasting,” In Proceedings of the IEEE International Joint Conference on Neural Networks, Atlanta, Georgia, USA, pp. 1422–1427 June 14–19, 2009.
  15. R. Fletcher, Practical Methods of Optimization, 2nd ed. John Wiley, Chichester, 1987.
  16. C. de Groot, D. Wurtz, “Analysis of univariate time series with connectionist nets: a case study of two classical examples,” Neurocomputing 3, pp. 177–192, 1991.
  17. J. Scott Armstrong, Combining Forecasts, Principles of Forecasting: A Handbook for Researchers and Practitioners; J. Scott Armstrong (ed.): Norwell, MA: Kluwer Academic Publishers, 2001.
  18. H. Demuth, M. Beale, M. Hagan, Neural Network Toolbox User's Guide, Natick, MA: The MathWorks, 2010.
  19. I. Trelea, "The particle swarm optimization algorithm: convergence analysis and parameter selection," Information Processing Letters 85, pp. 317–325, 2003.
  20. R.J. Hyndman, Time Series Data Library, URL: http://robjhyndman.com/TSDL/, January, 2010.
  21. V. Vapnik, Statistical Learning Theory, New York, Springer-Verlag, 1995.
  22. J.A.K. Suykens and J. Vandewalle, “Least squares support vector machines classifiers”, Neural Processing Letters, vol. 9, no. 3, pp. 293–300, 1999.
  23. Y. Fan, P. Li, Z. Song, “Dynamic least square support vector machine”, Proceedings of the 6th World Congress on Intelligent Control and Automation (WCICA), Dalian, China, pp. 4886-4889, June 21–23, 2006.
  24. B. Birge, "PSOt - A Particle Swarm Optimization Toolbox for use with Matlab," In Proceedings of the IEEE Swarm Intelligence Symposium, Indianapolis, Indiana, USA, pp. 182–186, 2003.
  25. K.W. Hipel, A.I. McLeod, Time Series Modelling of Water Resources and Environmental Systems, Amsterdam, Elsevier, 1994.
  26. C. Hamzacebi, “Improving artificial neural networks performance in seasonal time series forecasting,” Information Sciences 178, pp. 4550–4559, 2008.
Index Terms

Computer Science
Information Sciences

Keywords

Time Series Forecasting, Artificial Neural Network, Ensemble, Backpropagation, Training Algorithm, ARIMA, Support Vector Machine