Research Article

Artificial Neural Network

Published in May 2012 by Kshirsagar A. P., Rathod M. N.
National Conference on Recent Trends in Computing
Foundation of Computer Science USA
NCRTC - Number 2
May 2012
Authors: Kshirsagar A. P., Rathod M. N.

Kshirsagar A. P., Rathod M. N. . Artificial Neural Network. National Conference on Recent Trends in Computing. NCRTC, 2 (May 2012), 12-16.

@article{
author = { Kshirsagar A. P., Rathod M. N. },
title = { Artificial Neural Network },
journal = { National Conference on Recent Trends in Computing },
issue_date = { May 2012 },
volume = { NCRTC },
number = { 2 },
month = { May },
year = { 2012 },
issn = { 0975-8887 },
pages = { 12-16 },
numpages = { 5 },
url = { /proceedings/ncrtc/number2/6522-1012/ },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Proceeding Article
%1 National Conference on Recent Trends in Computing
%A Kshirsagar A. P.
%A Rathod M. N.
%T Artificial Neural Network
%J National Conference on Recent Trends in Computing
%@ 0975-8887
%V NCRTC
%N 2
%P 12-16
%D 2012
%I International Journal of Computer Applications
Abstract

A neural network is a powerful data-modeling tool that can capture and represent complex input/output relationships. Imagine the power of a machine with the abilities of both computers and humans; it would be the most remarkable thing ever. A neural network usually involves a large number of processors operating in parallel, each with its own small sphere of knowledge and access to data in its local memory. The computing world has much to gain from neural networks: their ability to learn by example makes them very flexible and powerful, and their parallel architecture gives them the fast response and computation times that suit real-time systems. With the correct implementation, neural networks can be applied naturally to online learning and large-dataset applications. If the 21st century is to be the age of intelligent machines, then neural networks will become an integral part of life. This paper focuses on the many aspects of neural networks, their past, present, and future, and explores what they hold folded for us in the 'GENERATION NEXT…. . '
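As a minimal sketch (not taken from the paper) of the "learning by example" idea the abstract describes, the following trains a single artificial neuron with the classic perceptron rule on the logical AND function; the data, learning rate, and epoch count are illustrative choices:

```python
def train_perceptron(samples, epochs=20, lr=0.1):
    """Learn two weights and a bias from labeled ((x1, x2), target) pairs."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            # Step activation: the neuron fires (1) if the weighted sum exceeds 0.
            out = 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
            err = target - out
            # Nudge weights and bias toward this example's correct answer.
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    return 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0

# The AND truth table serves as the training examples.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
print([predict(w, b, x1, x2) for (x1, x2), _ in data])  # [0, 0, 0, 1]
```

After training, the neuron reproduces the AND table, which is the supervised-learning loop in its simplest form: present an example, compare the output to the target, and adjust the weights by the error.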

Index Terms

Computer Science
Information Sciences

Keywords

Pattern Recognition, Neuron, Human Brain, Supervised Learning, Unsupervised Learning