Research Article

Movement Prediction using Reservoir Computing

by Ahmed Salam Abd Al Rasool, Sulaiman Murtadha Abbas
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 69 - Number 8
Year of Publication: 2013
DOI: 10.5120/11862-7646

Ahmed Salam Abd Al Rasool, Sulaiman Murtadha Abbas. Movement Prediction using Reservoir Computing. International Journal of Computer Applications 69, 8 (May 2013), 17-33. DOI=10.5120/11862-7646

@article{ 10.5120/11862-7646,
author = { Ahmed Salam Abd Al Rasool, Sulaiman Murtadha Abbas },
title = { Movement Prediction using Reservoir Computing },
journal = { International Journal of Computer Applications },
issue_date = { May 2013 },
volume = { 69 },
number = { 8 },
month = { May },
year = { 2013 },
issn = { 0975-8887 },
pages = { 17-33 },
numpages = { 17 },
url = { https://ijcaonline.org/archives/volume69/number8/11862-7646/ },
doi = { 10.5120/11862-7646 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
Abstract

In this work, the Reservoir Computing (RC) technique, specifically the Liquid State Machine (LSM), was chosen to simulate a movement-predictor system at minimum cost, covering both short-term and long-term prediction. This work also shows that the LSM can be simulated without event-based simulation, i.e., it proves that it is not necessary to interface MATLAB with other programming languages such as C or C++ to simulate the LSM. Encoding from the spiking to the analog domain was avoided, so no input information is lost to the encoding process; this also yields a simpler LSM scheme.
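To illustrate the time-based (clock-driven) simulation style the abstract describes, the following is a minimal sketch of an LSM-like setup: a discrete-time leaky integrate-and-fire reservoir driven directly by an analog input (no spike encoding of the input), with a linear readout trained by ridge regression to predict the signal one step ahead. All sizes, constants, and the use of Python/NumPy are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical clock-driven LSM sketch. Network size, leak factor,
# connectivity, and regularization are illustrative choices only.
import numpy as np

rng = np.random.default_rng(0)

N, T = 50, 2000          # reservoir neurons, simulation steps
alpha, v_th = 0.95, 1.0  # membrane leak per step, firing threshold

W_in = rng.normal(0, 0.3, N)                   # analog input injected directly
W = rng.normal(0, 1.5 / np.sqrt(N), (N, N))    # recurrent weights
W[rng.random((N, N)) > 0.2] = 0.0              # keep ~20% connectivity

dt = 1e-3
u = np.sin(2 * np.pi * 5 * dt * np.arange(T + 1))  # analog input signal

v = np.zeros(N)              # membrane potentials
s = np.zeros(N)              # spikes from the previous step
x = np.zeros(N)              # low-pass filtered spike traces (readout state)
states = np.empty((T, N))
for t in range(T):           # fixed time step: no event queue needed
    v = alpha * v + W_in * u[t] + W @ s
    s = (v >= v_th).astype(float)
    v = np.where(s > 0, 0.0, v)   # reset neurons that fired
    x = alpha * x + s             # smooth spikes into an analog state
    states[t] = x

# Linear readout: ridge regression predicting u one step ahead.
y = u[1:T + 1]
lam = 1e-2
W_out = np.linalg.solve(states.T @ states + lam * np.eye(N), states.T @ y)
pred = states @ W_out
mse = np.mean((pred - y) ** 2)
```

The key design point matching the abstract: the reservoir is advanced on a fixed clock, so plain vectorized array updates suffice and no event-based simulator (or C/C++ interface) is required; only the readout is trained, which keeps the cost low.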

Index Terms

Computer Science
Information Sciences

Keywords

Reservoir Computing; Spiking Recurrent Neural Network (SRNN); Liquid State Machine (LSM); Prediction; Spiking Neuron; Time-based simulation