Research Article

Speed Learning by Adaptive Skipping: Improving the Learning Rate of Artificial Neural Network through Adaptive Stochastic Sample Presentation

Published in 2011 by R. Manjula Devi, S. Kuppuswami
Artificial Intelligence Techniques - Novel Approaches & Practical Applications
Foundation of Computer Science USA
AIT - Number 2
2011
Authors: R. Manjula Devi, S. Kuppuswami

R. Manjula Devi, S. Kuppuswami. Speed Learning by Adaptive Skipping: Improving the Learning Rate of Artificial Neural Network through Adaptive Stochastic Sample Presentation. Artificial Intelligence Techniques - Novel Approaches & Practical Applications. AIT, 2 (2011), 36-39.

@article{manjuladevi2011speedlearning,
author = { R. Manjula Devi and S. Kuppuswami },
title = { Speed Learning by Adaptive Skipping: Improving the Learning Rate of Artificial Neural Network through Adaptive Stochastic Sample Presentation },
journal = { Artificial Intelligence Techniques - Novel Approaches & Practical Applications },
issue_date = { 2011 },
volume = { AIT },
number = { 2 },
year = { 2011 },
issn = { 0975-8887 },
pages = { 36-39 },
numpages = { 4 },
url = { /specialissues/ait/number2/2834-215/ },
publisher = { Foundation of Computer Science (FCS), NY, USA },
address = { New York, USA }
}
%0 Special Issue Article
%1 Artificial Intelligence Techniques - Novel Approaches & Practical Applications
%A R. Manjula Devi
%A S. Kuppuswami
%T Speed Learning by Adaptive Skipping: Improving the Learning Rate of Artificial Neural Network through Adaptive Stochastic Sample Presentation
%J Artificial Intelligence Techniques - Novel Approaches & Practical Applications
%@ 0975-8887
%V AIT
%N 2
%P 36-39
%D 2011
%I International Journal of Computer Applications
Abstract

The basic idea of this paper is to increase the learning rate of an artificial neural network (ANN) without affecting the accuracy of the system. New algorithms are given for dynamically reducing the number of input samples presented to the ANN, thereby increasing the rate of learning. This method, called Adaptive Skipping, can be used alongside any supervised learning algorithm. The training phase is the most crucial and time-consuming part of an ANN, and the rate at which the ANN learns is its most important characteristic. Among the factors affecting the learning rate, the size of the training set (the number of input samples used to train an ANN for a specific application) is considered, and how it affects the learning rate and accuracy of an ANN is discussed. Related work on reducing the training set is reviewed. The new Adaptive Skipping scheme, which dynamically determines how many epochs an input sample is to be skipped depending on the consecutive successful learning of that sample, is introduced. The algorithm and the steps to train an ANN using the new approach are given, and how the speedup in learning is tested is briefly discussed. The test results are also analyzed. Finally, future work and ideas in this area are discussed. The experiment is demonstrated with the help of a simple ANN using Adaptive Skipping along with standard backpropagation for learning.
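The skipping schedule itself is simple to express in code. Below is a minimal Python sketch (not taken from the paper) of one plausible reading of the scheme: each sample tracks its run of consecutive successful presentations, and a sample that has just been learned for the k-th consecutive time rests for the next k epochs, while a failure resets its schedule. The train_step interface, the error tolerance tol, and the exact rest-length rule are assumptions made for illustration.

import numpy as np

def train_with_adaptive_skipping(train_step, samples, targets,
                                 epochs=100, tol=0.1):
    # train_step(x, t) is any supervised update rule (e.g. one
    # backpropagation step) that returns the sample's error.
    n = len(samples)
    streak = np.zeros(n, dtype=int)      # consecutive successes per sample
    skip_until = np.zeros(n, dtype=int)  # first epoch the sample is shown again

    for epoch in range(epochs):
        for i in np.random.permutation(n):   # stochastic presentation order
            if epoch < skip_until[i]:
                continue                      # sample is resting: skip it
            err = train_step(samples[i], targets[i])
            if err < tol:
                streak[i] += 1                # learned again: lengthen the rest
                skip_until[i] = epoch + 1 + streak[i]
            else:
                streak[i] = 0                 # failed: present every epoch again
                skip_until[i] = 0

As a usage example, a single sigmoid unit trained with the delta rule (a stand-in for a full backpropagation network) learning the AND function:

w = np.zeros(3)

def step(x, t):
    global w
    y = 1.0 / (1.0 + np.exp(-w @ x))
    w += 0.5 * (t - y) * y * (1 - y) * x   # delta rule update
    return abs(t - y)

X = np.array([[1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1.]])  # bias + inputs
T = np.array([0, 0, 0, 1.])                                   # AND targets
train_with_adaptive_skipping(step, X, T, epochs=2000, tol=0.2)

Easy samples quickly accumulate long rests and drop out of most epochs, so later epochs present mainly the samples the network still gets wrong, which is where the speedup over plain epoch-by-epoch presentation would come from.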

Index Terms

Computer Science
Information Sciences

Keywords

Adaptive Skipping, Learning Rate, Accuracy, Training Set, Backpropagation Algorithm