Research Article

Multilayer Feed-Forward Neural Network Integrated with Dynamic Learning Algorithm by Pruning of Nodes and Connections

by Siddhaling Urolagin
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 47 - Number 2
Year of Publication: 2012
Authors: Siddhaling Urolagin
DOI: 10.5120/7159-8191

Siddhaling Urolagin. Multilayer Feed-Forward Neural Network Integrated with Dynamic Learning Algorithm by Pruning of Nodes and Connections. International Journal of Computer Applications 47, 2 (June 2012), 7-17. DOI=10.5120/7159-8191

@article{10.5120/7159-8191,
author = {Siddhaling Urolagin},
title = {Multilayer Feed-Forward Neural Network Integrated with Dynamic Learning Algorithm by Pruning of Nodes and Connections},
journal = {International Journal of Computer Applications},
issue_date = {June 2012},
volume = {47},
number = {2},
month = {June},
year = {2012},
issn = {0975-8887},
pages = {7-17},
numpages = {11},
url = {https://ijcaonline.org/archives/volume47/number2/7159-8191/},
doi = {10.5120/7159-8191},
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A Siddhaling Urolagin
%T Multilayer Feed-Forward Neural Network Integrated with Dynamic Learning Algorithm by Pruning of Nodes and Connections
%J International Journal of Computer Applications
%@ 0975-8887
%V 47
%N 2
%P 7-17
%D 2012
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Neural networks have found many applications in the real world. One of the important issues in designing a neural network is the size of its architecture. Dynamic learning algorithms aim to determine an appropriate network size during the learning phase. Dynamic learning by pruning removes network elements such as nodes, weights or biases in order to reduce the network's size and make it appropriate for the problem at hand. In this paper, two dynamic learning by pruning methods have been integrated with a multilayer feed-forward neural network: the Optimal Brain Damage (OBD) method, which prunes connections (weights or biases), and the Bottom-Up Freezing (BUF) method, which freezes and prunes nodes. Experiments have been conducted on the MNIST handwritten digit database, and the learning behavior of the multilayer feed-forward neural network integrated with the OBD and BUF methods has been analyzed.
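
The abstract describes the two pruning strategies only at a high level. As an illustration of the first, the Optimal Brain Damage method of LeCun, Denker and Solla [7] assigns each connection a saliency s_k = h_kk * w_k^2 / 2, where h_kk is the corresponding diagonal entry of the Hessian of the training error, and removes the connections with the smallest saliencies. The Python/NumPy sketch below shows one such pruning step; it is a minimal illustration of the general technique, not the implementation used in the paper, and it assumes the diagonal Hessian entries are supplied by the surrounding training loop:

    import numpy as np

    def obd_prune(weights, hessian_diag, prune_fraction=0.1):
        """One Optimal-Brain-Damage-style pruning step.

        weights        : 1-D array of connection weights (and biases).
        hessian_diag   : matching diagonal Hessian entries h_kk of the
                         training error, estimated by the training loop.
        prune_fraction : fraction of connections to remove this step.

        Returns (pruned_weights, mask); mask is False for removed
        connections so later updates can keep them at zero.
        """
        # OBD saliency: estimated increase in error when weight k is
        # set to zero, s_k = h_kk * w_k**2 / 2.
        saliency = 0.5 * hessian_diag * weights**2

        # Remove the connections with the smallest saliencies.
        n_prune = int(prune_fraction * weights.size)
        mask = np.ones(weights.size, dtype=bool)
        mask[np.argsort(saliency)[:n_prune]] = False
        return weights * mask, mask

    # Toy usage: prune 10% of 100 random connections.
    rng = np.random.default_rng(0)
    w = rng.normal(size=100)
    h = np.abs(rng.normal(size=100))  # stand-in for Hessian diagonals
    w_pruned, mask = obd_prune(w, h, prune_fraction=0.1)
    print(f"connections kept: {mask.sum()} / {mask.size}")

The Bottom-Up Freezing method of Farzan and Ghorbani [11] works at node rather than connection granularity: hidden nodes whose parameters have stabilized are frozen, and frozen nodes that no longer carry information can be pruned. The sketch below is a hypothetical rendering of that freeze-then-prune idea under simple stability criteria (mean incoming-weight change and activation variance); the actual BUF criteria are those of reference [11]:

    import numpy as np

    def buf_step(w_in, w_in_prev, hidden_acts, frozen,
                 freeze_tol=1e-3, prune_tol=1e-2):
        """One node-level freeze/prune pass over a hidden layer.

        w_in, w_in_prev : incoming weight matrices (n_hidden x n_in)
                          at the current and previous epoch.
        hidden_acts     : activations over a validation batch
                          (n_samples x n_hidden).
        frozen          : boolean vector, True for frozen nodes.
        """
        # Freeze nodes whose incoming weights barely moved this epoch.
        weight_change = np.abs(w_in - w_in_prev).mean(axis=1)
        frozen = frozen | (weight_change < freeze_tol)

        # A frozen node with near-constant output carries no information
        # and can be pruned (absorbed into the next layer's biases).
        prune = frozen & (hidden_acts.std(axis=0) < prune_tol)
        return frozen, prune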

References
  1. Giovanna Castellano, Anna Maria Fanelli and Marcello Pelillo, 1997. An Iterative Pruning Algorithm for Feedforward Neural Networks, in IEEE Trans. on Neural Networks, 8(3), 519-531.
  2. Russell Reed, 1993. Pruning Algorithms - A Survey, in IEEE Trans. on Neural Networks, 4(5), 740-747.
  3. Timothy Masters, 1993. Practical Neural Network Recipes in C++, Academic Press, Inc., Harcourt Brace & Company, Boston, San Diego, New York.
  4. E. B. Baum and D. Haussler, 1989. What size net gives valid generalization?, in Neural Computation, 1, 151-160.
  5. J. Denker, D. Schwartz, B. Wittner, S. Solla, R. Howard, L. Jackel and J. Hopfield, 1987. Large automatic learning, rule extraction, and generalization, in Complex Systems, 1, 877-922.
  6. Y. LeCun, 1989. Generalization and network design strategies, in Connectionism in Perspective, R. Pfeifer, Z. Schreter, F. Fogelman-Soulie and L. Steels, Eds., Amsterdam: Elsevier, 143-155.
  7. Y. LeCun, J. S. Denker and S. A. Solla, 1990. Optimal Brain Damage, in Advances in Neural Information Processing Systems 2, D. S. Touretzky, Ed. (Denver 1989), 598-605.
  8. B. Hassibi and D. G. Stork, 1993. Second-order derivatives for network pruning: Optimal Brain Surgeon, in Advances in Neural Information Processing Systems, S. J. Hanson, J. D. Cowan and C. L. Giles, Eds., San Mateo, CA: Morgan Kaufmann, 164-171.
  9. J. Sietsma and R. J. F. Dow, 1991. Creating artificial neural networks that generalize, in Neural Networks, 4(1), 67-79.
  10. M. C. Mozer and P. Smolensky, 1989. Skeletonization: A technique for trimming the fat from a network via relevance assessment, in Advances in Neural Information Processing Systems 1 (Denver, 1988), D. S. Touretzky, Ed., Morgan Kaufmann, San Mateo, 107-115.
  11. Ali Farzan and Ali A. Ghorbani, 2001. The Bottom-Up Freezing: An Approach to Neural Engineering, in Proceedings of Advances in Artificial Intelligence: 14th Biennial Conference of the Canadian Society for Computational Studies of Intelligence, AI 2001, Ottawa, Canada, 317-324.
  12. Giorgio Corani and Giorgio Guariso, 2005. An application of pruning in the design of neural networks for real time flood forecasting, in Neural Computing and Applications, 14(1), 66-77.
  13. P. Galerne, K. Yao and G. Burel, 1998. New Neural Network Pruning and its application to Sonar Imagery, in Conference IEEE-CESA'98, Hammamet, Tunisia, April 1-4.
  14. Kenji Suzuki, Isao Horiba and Noboru Sugie, 2001. A Simple Neural Network Pruning Algorithm with Application to Filter Synthesis, in Neural Processing Letters, 13(1), 43-53.
  15. Y. LeCun, 1987. Modèles connexionnistes de l'apprentissage (connectionist learning models), PhD thesis, Université P. et M. Curie (Paris 6).
Index Terms

Computer Science
Information Sciences

Keywords

Pruning, Dynamic Learning, Freezing, Neural Network