Research Article

Exploring optimal architecture of Multi-layered Feed-forward (MLFNN) as Bidirectional Associative Memory (BAM) for Function Approximation

by Manisha Singh, Somesh Kumar
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 76 - Number 16
Year of Publication: 2013
Authors: Manisha Singh, Somesh Kumar
DOI: 10.5120/13333-0916

Manisha Singh, Somesh Kumar. Exploring optimal architecture of Multi-layered Feed-forward (MLFNN) as Bidirectional Associative Memory (BAM) for Function Approximation. International Journal of Computer Applications. 76, 16 (August 2013), 28-32. DOI=10.5120/13333-0916

@article{ 10.5120/13333-0916,
author = { Manisha Singh, Somesh Kumar },
title = { Exploring optimal architecture of Multi-layered Feed-forward (MLFNN) as Bidirectional Associative Memory (BAM) for Function Approximation },
journal = { International Journal of Computer Applications },
issue_date = { August 2013 },
volume = { 76 },
number = { 16 },
month = { August },
year = { 2013 },
issn = { 0975-8887 },
pages = { 28-32 },
numpages = {5},
url = { https://ijcaonline.org/archives/volume76/number16/13333-0916/ },
doi = { 10.5120/13333-0916 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A Manisha Singh
%A Somesh Kumar
%T Exploring optimal architecture of Multi-layered Feed-forward (MLFNN) as Bidirectional Associative Memory (BAM) for Function Approximation
%J International Journal of Computer Applications
%@ 0975-8887
%V 76
%N 16
%P 28-32
%D 2013
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Function approximation is an instance of supervised learning, one of the most studied topics in machine learning, artificial neural networks, pattern recognition, and statistical curve fitting. In principle, any of the methods studied in these fields can be used in reinforcement learning. Multi-layered feed-forward neural networks (MLFNN) have been used extensively for function approximation. Another class of neural networks, the BAM, has also been studied and applied to pattern-mapping problems, and many variations have been reported in the literature. In the present study, the backpropagation algorithm is applied to an MLFNN in such a way that the feed-forward architecture behaves like a BAM. Various four-layer architectures have been explored in quest of the optimal architecture for the example function.
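The setup the abstract describes can be sketched in a few lines. What follows is a minimal, illustrative sketch, not the authors' code: a four-layer feed-forward network trained with plain backpropagation to approximate an example function (here y = sin(x)). The hidden-layer widths, tanh activation, and learning rate are assumptions, and the bidirectional (BAM-style) use of the trained network is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

X = np.linspace(-np.pi, np.pi, 64).reshape(-1, 1)  # training inputs
Y = np.sin(X)                                      # example function to approximate

sizes = [1, 8, 8, 1]  # four layers: input, two hidden, output (widths assumed)
W = [rng.normal(0.0, 0.5, (a, b)) for a, b in zip(sizes[:-1], sizes[1:])]
b = [np.zeros(n) for n in sizes[1:]]

def forward(x):
    """Return the activations of every layer, input first."""
    acts = [x]
    for i, (w, bi) in enumerate(zip(W, b)):
        z = acts[-1] @ w + bi
        acts.append(np.tanh(z) if i < len(W) - 1 else z)  # linear output layer
    return acts

def mse(x, y):
    return float(np.mean((forward(x)[-1] - y) ** 2))

mse_before = mse(X, Y)
lr = 0.05
for _ in range(3000):                 # full-batch gradient descent
    acts = forward(X)
    delta = (acts[-1] - Y) / len(X)   # dLoss/dz at the linear output
    for i in reversed(range(len(W))):
        gw = acts[i].T @ delta        # gradient for W[i] (z = acts[i] @ W[i] + b[i])
        gb = delta.sum(axis=0)
        if i > 0:                     # propagate through the tanh hidden layers
            delta = (delta @ W[i].T) * (1.0 - acts[i] ** 2)
        W[i] -= lr * gw
        b[i] -= lr * gb
```

Training drives the mean-squared error down; the paper's question is then which hidden-layer widths, in four-layer architectures of this kind, approximate the example function best.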

References
  1. Rumelhart D. E., Hinton G. E., and Williams R. J., "Learning Internal Representations by Error Propagation", Parallel Distributed Processing, Vol. I: Foundations, MIT Press, Cambridge, MA, pages 318-364, 1986.
  2. Tang C. Z. and Kwan H. K., "Parameter effects on convergence speed and generalization capability of backpropagation algorithm", International Journal of Electronics, 74(1):35-46, January 1993.
  3. Kosko B., Neural Networks and Fuzzy Systems, Prentice-Hall, Inc., New Jersey, 1992.
  4. Singh M. and Kumar S., "Using Multi-layered Feed-forward Neural Network (MLFNN) Architecture as Bidirectional Associative Memory (BAM) for Function Approximation", IOSR Journal of Computer Engineering (IOSR-JCE), vol. 13, issue 4, pp 34-38.
  5. Pearlmutter B. A., "Gradient calculations for dynamic recurrent neural networks: a survey", IEEE Trans. on Neural Networks, 6(5):1212-1228, Sept. 1995.
  6. Prokhorov D. V., Feldkamp L. A., and Tyukin I. Y., "Adaptive behavior with fixed weights in RNN: an overview", Proceedings of International Joint Conference on Neural Networks, 2002, pages 2018-2022.
  7. Kwan H. K. and Yan J., "Second-order recurrent neural network for word sequence learning", Proceedings of 2001 International Symposium on Intelligent Multimedia, Video and Speech Processing, Hong Kong, May 2-4, 2001, pages 405-408.
  8. Tai H.-M., Wu C.-H., and Jong T.-L., "High-order bidirectional associative memory", Electronics Letters, 25(21):1424-1425, 12th Oct. 1989.
  9. Jeng Y.-J., Yeh C.-C., and Chiueh T. D., "Exponential bidirectional associative memories", Electronics Letters, 26(11):717-718, 24th May 1990.
  10. Chung F. L. and Lee Tong, "On fuzzy associative memory with multiple-rule storage capacity", IEEE Trans. on Fuzzy Systems, 4(3):375-384, August 1996.
  11. Wang T., Zhuang X., and Xing X., "Weight learning of bidirectional associative memories by global minimization", IEEE Trans. on Neural Networks, 3(6):1010-1018, Nov. 1992.
  12. Shanmukh K. and Venkatesh Y. V., "Generalized scheme for optimal learning in recurrent neural networks", IEE Proc.: Vision, Image, and Signal Processing, 142(2):71-77, April 1995.
  13. Salih I., Smith S. H., and Liu D., "Design of bidirectional associative memories based on the perceptron training technique", Proceedings of IEEE International Symposium on Circuits and Systems, 1999, Vol. 5, pages 355-358.
  14. Kwan H. K. and Tsang P. C., "Multi-layer recursive neural network", Proceedings of Canadian Conference on Electrical and Computer Engineering, Montreal, Canada, September 17-20, 1989, Vol. 1, pages 428-431.
  15. Widrow B. and Winter R., "Neural Nets for Adaptive Filtering and Adaptive Pattern Recognition", IEEE Computer Magazine, pages 25-39, March 1988.
  16. Chartier S., Giguere G., and Langlois D., "A new bidirectional heteroassociative memory encompassing correlational, competitive and topological properties", Neural Networks, 22:568-578, 2009.
  17. Wang Y. F., Cruz J. B., and Mulligan J. H., "Two coding strategies for bidirectional associative memory", IEEE Trans. Neural Networks, vol. 1, no. 1, pp 81-92, 1990.
  18. Wang Y. F., Cruz J. B., and Mulligan J. H., "Guaranteed recall of all training pairs for bidirectional associative memory", IEEE Trans. Neural Networks, vol. 2, no. 6, pp 559-567, 1991.
  19. Kang H., "Multilayer associative neural network (MANN's): Storage capacity versus perfect recall", IEEE Trans. Neural Networks, vol. 5, pp 812-822, 1994.
  20. Wang Z., "A bidirectional associative memory based on optimal linear associative memory", IEEE Trans. Neural Networks, vol. 45, pp 1171-1179, Oct. 1996.
  21. Hassoun M. H. and Youssef A. M., "A high performance recording algorithm for Hopfield model associative memories", Opt. Eng., vol. 27, pp 46-54, 1989.
  22. Simpson P. K., "Higher-ordered and interconnected bidirectional associative memories", IEEE Trans. Syst., Man, Cybern., vol. 20, no. 3, pp 637-652, 1990.
  23. Zhuang X., Huang Y., and Chen S. S., "Better learning for bidirectional associative memory", Neural Networks, vol. 6, no. 8, pp 1131-1146, 1993.
  24. Oh H. and Kothari S. C., "Adaptation of the relaxation method for learning bidirectional associative memory", IEEE Trans. Neural Networks, vol. 5, pp 573-583, July 1994.
  25. Wang C. C. and Don H. S., "An analysis of high-capacity discrete exponential BAM", IEEE Trans. Neural Networks, vol. 6, no. 2, pp 492-496, 1995.
Index Terms

Computer Science
Information Sciences

Keywords

Neural networks, Multilayered feed-forward neural network (MLFNN), Bidirectional Associative Memory (BAM), function approximation