Research Article

A new Hierarchical Pattern Recognition method using Mirroring Neural Networks

by Dasika Ratna Deepthi, K. Eswaran
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 1 - Number 12
Year of Publication: 2010
Authors: Dasika Ratna Deepthi, K. Eswaran
DOI: 10.5120/252-409

Dasika Ratna Deepthi, K. Eswaran. A new Hierarchical Pattern Recognition method using Mirroring Neural Networks. International Journal of Computer Applications. 1, 12 (February 2010), 88-96. DOI=10.5120/252-409

@article{ 10.5120/252-409,
author = { Dasika Ratna Deepthi, K. Eswaran },
title = { A new Hierarchical Pattern Recognition method using Mirroring Neural Networks },
journal = { International Journal of Computer Applications },
issue_date = { February 2010 },
volume = { 1 },
number = { 12 },
month = { February },
year = { 2010 },
issn = { 0975-8887 },
pages = { 88-96 },
numpages = {9},
url = { https://ijcaonline.org/archives/volume1/number12/252-409/ },
doi = { 10.5120/252-409 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A Dasika Ratna Deepthi
%A K. Eswaran
%T A new Hierarchical Pattern Recognition method using Mirroring Neural Networks
%J International Journal of Computer Applications
%@ 0975-8887
%V 1
%N 12
%P 88-96
%D 2010
%I Foundation of Computer Science (FCS), NY, USA
Abstract

In this paper, we develop a hierarchical classifier (an inverted tree-like structure) consisting of an organized set of "blocks", each of which is a module that performs feature extraction and an associated classification. We build each such block by coupling a Mirroring Neural Network (MNN) with a clustering algorithm, wherein the functions of the MNN are automatic data reduction and feature extraction, which precede an unsupervised classification. We then devise an algorithm, which we name the "Tandem Algorithm", for the self-supervised learning of the MNN and an ensuing process of unsupervised pattern classification, so that an ensemble of samples presented to the hierarchical classifier is classified and then sub-classified automatically. This tandem process is a two-step process (feature extraction/data reduction followed by classification), implemented at each block (module), and can be extended level by level in the hierarchical architecture. The proposed procedure is demonstrated on two example cases in which a collage of images consisting of faces, flowers and furniture is classified and sub-classified automatically.
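The two-step "block" described above can be sketched in Python. This is an illustrative toy, not the authors' implementation: a single-hidden-layer autoencoder stands in for the MNN (input mirrored back through a low-dimensional bottleneck), and plain k-means stands in for the clustering step; all function names, dimensions and hyperparameters here are our own assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_mnn(X, n_hidden, epochs=200, lr=0.1):
    """Toy 'mirroring' autoencoder: input -> bottleneck -> reconstructed input.
    Trained by gradient descent on the mean squared reconstruction error.
    Returns an encoder mapping samples to reduced-dimension features."""
    n, d = X.shape
    W1 = rng.normal(0, 0.1, (d, n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(0, 0.1, (n_hidden, d)); b2 = np.zeros(d)
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)       # reduced-dimension features
        Y = H @ W2 + b2                # mirrored (reconstructed) input
        err = Y - X
        # backpropagate the reconstruction error through both layers
        gW2 = H.T @ err / n
        gb2 = err.mean(axis=0)
        dH = (err @ W2.T) * (1 - H**2)
        gW1 = X.T @ dH / n
        gb1 = dH.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return lambda Z: np.tanh(Z @ W1 + b1)

def kmeans(F, k, iters=50):
    """Plain Lloyd's k-means on the extracted features (stand-in for
    whatever clustering algorithm a block uses)."""
    centers = F[rng.choice(len(F), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((F[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = F[labels == j].mean(axis=0)
    return labels

# Toy data: two well-separated groups of 20-dimensional samples.
X = np.vstack([rng.normal(0, 0.1, (30, 20)), rng.normal(1, 0.1, (30, 20))])

encode = train_mnn(X, n_hidden=4)   # step 1: feature extraction / data reduction
labels = kmeans(encode(X), k=2)     # step 2: unsupervised classification
```

In the hierarchical scheme, the samples assigned to each cluster would then be fed to a child block (a fresh MNN plus clustering on that subset), repeating the same tandem process level by level.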

References
  1. Buchanan, B.G. 1994. The role of experimentation in A.I. In Phil. Trans. R. Soc. A, 349, 153-166.
  2. Zucker, J.D. 2003. A grounded theory of abstraction in artificial intelligence. In Phil. Trans. R. Soc. B, 358, 1293-1309.
  3. Holte, R.C. & Choueiry, B.Y. 2003. Abstraction & reformulation in A.I. In Phil. Trans. Roy. Soc.B, 358, 1197-1204.
  4. Cruse, H., Durr, V. & Schmitz, J. 2007. Insect walking is based on a decentralized architecture revealing a simple and robust controller. In Phil. Trans. R. Soc. A, 365, 221-250.
  5. Freund, Y. & Schapire, R.E. 1996. Experiments with a new boosting algorithm. In Proc. 13th International Conference on Machine Learning, pp. 148-156.
  6. Viola. P, & Jones, M. 2001. Rapid object detection using a boosted cascade of simple features. In Proc. IEEE Computer Society Conference on Computer Vision and Pattern Recognition 1, I-511-I-518 (2001).
  7. Hinton, G.E. & Salakhutdinov, R.R. 2006. Reducing the Dimensionality of Data with Neural Networks. In Science 313, 504-507.
  8. Law, H.C. 2006. Clustering, Dimensionality Reduction and Side Information, Ph. D. Thesis, Michigan State University.
  9. Joachims, T. 1998. Text categorization with support vector machines: learning with many relevant features. In Proc. 10th European Conference on Machine Learning, 137-142.
  10. Craven, M., DiPasquo, D., Freitag, D., McCallum, A.K. & Mitchell, T.M. 2000. Learning to construct knowledge bases from the World Wide Web. In Artificial Intelligence 118, 69-113.
  11. Garcia, C. & Delakis, M. 2004. Convolutional face finder: A neural architecture for fast and robust face detection. In IEEE Trans. Pattern Anal. Mach. Intell. 26, 1408-1423.
  12. Phung, S.M. & Bouzerdoum, A. 2007. A Pyramidal Neural Network For Visual Pattern Recognition. In IEEE Transactions on Neural Networks 18, 329-343.
  13. Rosenblum, M., Yacoob, Y. & Davis, L.S. 1996. Human expression recognition from motion using a radial basis function network architecture. In IEEE Trans. Neural Networks 7, 1121-1138.
  14. Baldi, P. & Hornik, K. 1989. Neural networks and principal component analysis: learning from examples without local minima. In Neural Networks 2, 53-58.
  15. DeMers, D. & Cottrell, G. 1993. Non-linear dimensionality reduction. In Advances in Neural Information Processing Systems 5, Morgan Kaufmann, 580-587.
  16. Hopfield, J.J. & Brody, C.D. 2004. Learning rules and network repair in spike-timing-based computation networks. In Proc. Natl. Acad. Sci. U. S. A. 101, 337-342.
  17. Lau, B., Stanley, G.B. & Dan, Y. 2002. Computational subunits of visual cortical neurons revealed by artificial neural networks. In Proc. Nat. Acad. Sci. USA 99, 8974-79.
  18. Deepthi, D.R., Kuchibhotla, S. & Eswaran, K. 2007. Dimensionality reduction and reconstruction using mirroring neural networks and object recognition based on reduced dimension characteristic vector. In IEEE International Conference on Advances in Computer Vision and Information Technology (IEEE, ACVIT-07), 348-353.
  19. Kolmogorov,A.N. 1957. On the representation of continuous functions of several variables by superposition of continuous functions of one variable and addition. In Doklady Akademia Nauk SSSR 114(5), 953-956.
  20. Deepthi, D.R. 2009. Automatic pattern recognition for applications in image processing and robotics, Ph. D. Thesis, Osmania University, Hyderabad, India.
  21. Creutzfeldt, D.O. 1977. Generality of the functional structure of the Neocortex. In Naturwissenschaften 64, 507-517.
  22. Mountcastle, V.B. 1978. An organizing principle for cerebral function: The unit model and the distributed system. In The Mindful Brain, Edelman, G.M. & Mountcastle, V.B., Eds., Cambridge, Mass.: MIT Press.
  23. Felleman, D.J. & Van Essen, D.C. 1991. Distributed hierarchical processing in the primate cerebral cortex. In Cerebral Cortex 1, 1-47.
  24. Rao, R.P. & Ballard, D.H. 1999. Predictive coding in the visual cortex: A functional interpretation of some extra-classical-receptive-field effects. In Nature Neuroscience 2, 79-87.
  25. Sherman, S.M. & Guillery, R.W. 2002. The role of the thalamus in the flow of information to the cortex. In Phil. Trans. Roy. Soc. London 357, 1695-1708.
  26. Kaiser, M. 2007. Brain architecture: A design for natural computation. In Phil. Trans. Roy. Soc. A, 365, 3033-3045.
  27. Buzsaki, G., Geisler, C., Henze, D.A. & Wang, X.J. 2004. Interneuron diversity series: Circuit complexity and axon wiring economy of cortical interneurons. In Trends Neurosci. 27, 186-193.
  28. Johnsen, J.D., Santhakumar, V., Morgan, R.J., Huerta, R., Tsimring, L. & Soltesz, I. 2007. Topological determinants of epileptogenesis in large-scale structural and functional models of the dentate gyrus derived from experimental data. In J. Neurophysiol. 97, 1566-1587.
  29. Van Essen, D.C., Anderson,C.H., & Felleman, D.J. 1992. Information processing in the primate visual system: an integrated systems perspective. In Science 255, 419-423.
  30. Hawkins, J. 2005. On intelligence, Owl Books, Henry Holt & Co., New York, pp. 110-125.
  31. Bell, B.G. 1994. Levels & loops: the future of artificial intelligence & neuroscience. In Phil. Trans. R. Soc. B, 354, 2013-2030.
  32. Hawkins, J. & George, D. 2007. Hierarchical Temporal Memory, Concepts, Theory, and Terminology. In Numenta (Numenta Inc.), pp. 1-20, www.numenta.com.
  33. George, D. 2008 How the brain might work: A hierarchical and temporal model for learning and recognition. Ph. D. Thesis, Stanford University.
  34. Herrero, J., Valencia, A. & Dopazo, J. 2001. A hierarchical unsupervised growing neural network for clustering gene expression patterns. In Bioinformatics 17, 126-136.
  35. Alex, L.P.T., Jacek, M.Z., Lai-Ping, W. & Xu, J. 2007. The hierarchical fast learning artificial neural network (HieFLANN): an autonomous platform for hierarchical neural network construction. In IEEE Trans. Neural Networks 18, 1645-1657.
  36. FERET database: www.frvt.org/FERET/.
  37. MANCHESTER database: www.ecse.rpi.edu/ cvrl/database/.
  38. JAFFE database: www.kasrl.org/jaffe.html.
  39. Gose, E., Johnsonbaugh, R. & Jost, S. 2000. Pattern Recognition and Image Analysis, Prentice Hall of India, New Delhi, pp 211-213.
  40. Deepthi, D.R., Krishna, G.R.A. & Eswaran, K. 2007. Automatic pattern classification by unsupervised learning using dimensionality reduction of data with mirroring neural networks. In IEEE International Conference on Advances in Computer Vision and Information Technology (IEEE, ACVIT-07), 354-360.
  41. Rumelhart, D.E., Hinton, G.E. & Williams, R.J. 1986. Learning Representations by back-propagating Errors. In Nature 323, 533-536.
  42. Widrow, B. & Lehr, M.A. 1990. 30 Years of Adaptive Neural Networks: Perceptron, Madaline, and Backpropagation. In Proceedings of the IEEE 78 (9).
  43. Deepthi, D.R. & Eswaran, K. 2009. Pattern recognition and memory mapping using mirroring neural networks. In IEEE International Conference on Emerging Trends in Computing (IEEE, ICETiC 2009),India, 317-321.
Index Terms

Computer Science
Information Sciences

Keywords

Hierarchical Pattern Recognition, classifier, feature extraction, Mirroring Neural Networks, unsupervised classification, Tandem Algorithm, self-supervised learning