International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 21, Number 3
Year of Publication: 2011
Authors: Vikas Chaudhary, Dr. Anil K. Ahlawat, Dr. R.S. Bhatia
DOI: 10.5120/2495-3372
Vikas Chaudhary, Dr. Anil K. Ahlawat, Dr. R.S. Bhatia. Growing Neural Networks using Soft Competitive Learning. International Journal of Computer Applications. 21, 3 (May 2011), 1-6. DOI=10.5120/2495-3372
This paper gives an overview of some classical Growing Neural Networks (GNN) that use soft competitive learning. In soft competitive learning, each input signal is processed by adapting not only the winning neuron but also some other neurons of the network. A GNN is also called an ANN with incremental learning. The mapping capability of an artificial neural network (ANN) depends on the number of layers and the number of neurons in its hidden layers, and there is no formal way of computing a suitable network structure. The structure is usually selected by trial and error, which is a time-consuming process. Basically, two mechanisms may modify the structure of the network: growth and pruning. In this paper, competitive learning is introduced first; then the SOM topology and the limitations of the SOM are illustrated; finally, a class of classical GNNs with soft competitive learning is reviewed, including the Neural Gas Network (NGN), Growing Neural Gas (GNG), Self-Organizing Surfaces (SOS), Incremental Grid Growing (IGG), Evolving Self-Organizing Maps (ESOM), the Growing Hierarchical Self-Organizing Map (GHSOM), and Growing Cell Structures (GCS).
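To make the soft-competitive adaptation rule concrete, the following minimal Python sketch (illustrative only, not code from the paper) implements a Neural Gas style update step: every unit is pulled toward the input, with a step size that decays exponentially with the unit's distance rank, so the winner adapts most and the remaining units adapt progressively less. The function name neural_gas_step and the parameters epsilon and lam are assumptions made for this example.

    import numpy as np

    def neural_gas_step(weights, x, epsilon=0.1, lam=2.0):
        """One soft-competitive (Neural Gas style) adaptation step."""
        # Rank units by distance to the input (rank 0 = winner).
        dists = np.linalg.norm(weights - x, axis=1)
        ranks = np.argsort(np.argsort(dists))
        # Soft-competitive update: all units adapt, weighted by their rank.
        lr = epsilon * np.exp(-ranks / lam)
        return weights + lr[:, None] * (x - weights)

    # Usage: adapt five random 2-D units toward a stream of random inputs.
    rng = np.random.default_rng(0)
    W = rng.random((5, 2))
    for _ in range(100):
        W = neural_gas_step(W, rng.random(2))

In hard competitive learning only the winner (rank 0) would be adapted; the exponential rank weighting is what makes the scheme soft.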