International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 76, Number 16
Year of Publication: 2013
Authors: Manisha Singh, Somesh Kumar
DOI: 10.5120/13333-0916
Manisha Singh and Somesh Kumar. Exploring optimal architecture of Multi-layered Feed-forward (MLFNN) as Bidirectional Associative Memory (BAM) for Function Approximation. International Journal of Computer Applications 76, 16 (August 2013), 28-32. DOI=10.5120/13333-0916
Function approximation is an instance of supervised learning and is among the most studied topics in machine learning, artificial neural networks, pattern recognition, and statistical curve fitting; in principle, any of the methods developed in these fields can be applied to it. Multi-layered feed-forward neural networks (MLFNN) have been used extensively for function approximation. Another class of neural networks, the bidirectional associative memory (BAM), has also been studied and applied to pattern-mapping problems, and many variants have been reported in the literature. The present study proposes applying the back-propagation algorithm to an MLFNN in such a way that the feed-forward architecture behaves like a BAM. Several four-layer architectures are explored in search of the optimal architecture for an example function.
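The setup the abstract describes, a four-layer feed-forward network trained by back propagation to approximate a function, can be sketched as follows. This is a minimal illustration only: the layer sizes, learning rate, target function y = sin(x), and training loop are assumptions for the example, not the architecture or data reported in the paper, and the BAM-style bidirectional behaviour that is the paper's actual contribution is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def init(sizes):
    # One (weight, bias) pair per layer transition; small random
    # weights, zero biases. sizes = [input, hidden1, hidden2, output].
    return [(rng.normal(0, 0.5, (m, n)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def forward(params, x):
    # tanh on hidden layers, linear output; keep every activation
    # so the backward pass can reuse them.
    acts = [x]
    for i, (W, b) in enumerate(params):
        z = acts[-1] @ W + b
        acts.append(np.tanh(z) if i < len(params) - 1 else z)
    return acts

def backprop_step(params, x, y, lr=0.05):
    # One full-batch gradient step on mean-squared error.
    acts = forward(params, x)
    delta = (acts[-1] - y) / len(x)          # dMSE/d(output)
    for i in range(len(params) - 1, -1, -1):
        W, b = params[i]
        gW = acts[i].T @ delta
        gb = delta.sum(axis=0)
        if i > 0:
            # Propagate error through tanh: tanh'(z) = 1 - tanh(z)^2.
            delta = (delta @ W.T) * (1 - acts[i] ** 2)
        params[i] = (W - lr * gW, b - lr * gb)
    return float(np.mean((acts[-1] - y) ** 2))

# Illustrative task: approximate y = sin(x) on [-pi, pi].
x = np.linspace(-np.pi, np.pi, 64).reshape(-1, 1)
y = np.sin(x)
params = init([1, 8, 8, 1])                  # a four-layer architecture
losses = [backprop_step(params, x, y) for _ in range(3000)]
print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

Exploring architectures as the paper does would amount to varying the hidden-layer widths passed to `init` and comparing the resulting approximation error.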