Research Article

Principal Pattern Analysis: A Combined Approach for Dimensionality Reduction with Pattern Categorization

by T. Kalai Chelvi, P. Rangarajan
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 41 - Number 6
Year of Publication: 2012
DOI: 10.5120/5548-7616

T. Kalai Chelvi, P. Rangarajan. Principal Pattern Analysis: A Combined Approach for Dimensionality Reduction with Pattern Categorization. International Journal of Computer Applications 41, 6 (March 2012), 35-41. DOI=10.5120/5548-7616

@article{10.5120/5548-7616,
  author     = {T. Kalai Chelvi and P. Rangarajan},
  title      = {Principal Pattern Analysis: A Combined Approach for Dimensionality Reduction with Pattern Categorization},
  journal    = {International Journal of Computer Applications},
  issue_date = {March 2012},
  volume     = {41},
  number     = {6},
  month      = {March},
  year       = {2012},
  issn       = {0975-8887},
  pages      = {35-41},
  numpages   = {7},
  url        = {https://ijcaonline.org/archives/volume41/number6/5548-7616/},
  doi        = {10.5120/5548-7616},
  publisher  = {Foundation of Computer Science (FCS), NY, USA},
  address    = {New York, USA}
}
%0 Journal Article
%A T. Kalai Chelvi
%A P. Rangarajan
%T Principal Pattern Analysis: A Combined Approach for Dimensionality Reduction with Pattern Categorization
%J International Journal of Computer Applications
%@ 0975-8887
%V 41
%N 6
%P 35-41
%D 2012
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Over the past decades, several techniques have been developed to address the data analysis problem in science domains such as engineering, astronomy, biology, remote sensing, economics, and consumer transactions. Reducing the dimension of the data (retaining fewer features) is required in order to improve the efficiency and accuracy of data analysis. Traditional statistical methods break down partly because of the increase in the number of observations, but mainly because of the increase in the number of variables associated with each observation. As a consequence, a technique called Principal Pattern Analysis is developed, which combines feature extraction with pattern categorization. It first applies principal component analysis to extract eigenvectors; then, to prove the pattern categorization theorem, the corresponding patterns are segregated. Decisive factors, expressed as weight vectors, are determined to categorize the patterns. Experimental results show that the error approximation rate is very low and that the method is well suited to high-dimensional datasets.
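The full algorithm appears in the paper itself, not on this page. As a rough illustration of the two stages the abstract describes, the Python sketch below applies principal component analysis to extract eigenvectors and then categorizes each pattern by its weight vector in the reduced space. The threshold rule and all function names are assumptions made for illustration, not the authors' method.

# Minimal sketch of the two stages named in the abstract: (1) PCA to
# extract eigenvectors, (2) categorizing patterns via their projection
# ("weight vector") in the principal subspace. The threshold rule below
# is a placeholder; the paper's categorization theorem is not reproduced here.
import numpy as np

def pca_eigenvectors(X, k):
    """Return the top-k eigenvectors and eigenvalues of the covariance of X
    (X has shape n_samples x n_features)."""
    Xc = X - X.mean(axis=0)                 # center the data
    cov = np.cov(Xc, rowvar=False)          # feature covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigh: for symmetric matrices
    order = np.argsort(eigvals)[::-1]       # sort by decreasing variance
    return eigvecs[:, order[:k]], eigvals[order[:k]]

def categorize_patterns(X, components, threshold=1.0):
    """Hypothetical categorization: label each sample by whether the norm of
    its weight vector (projection onto the principal subspace) exceeds a threshold."""
    Xc = X - X.mean(axis=0)
    weights = Xc @ components               # weight vectors in reduced space
    labels = np.where(np.linalg.norm(weights, axis=1) > threshold, 1, 0)
    return labels, weights

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))          # toy high-dimensional data
    components, variances = pca_eigenvectors(X, k=3)
    labels, weights = categorize_patterns(X, components)
    print(weights.shape, labels[:10])

Computing eigenvectors of the covariance matrix with eigh keeps the sketch self-contained; for very high-dimensional data, an SVD of the centered data matrix is the usual, numerically more stable route to the same subspace.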

References
  1. Jin, H., Ooi, B.C., Shen, H.T., Yu, C., Zhou, A.Y.: An adaptive and efficient dimensionality reduction algorithm for high-dimensional indexing. In: Proc. ICDE (2003)
  2. Bi, J., Bennett, K.P., Embrechts, M., Breneman, C.M., Song, M.: Dimensionality reduction via sparse support vector machines. Journal of Machine Learning Research (2003)
  3. Ilin, A., Raiko, T.: Practical approaches to principal component analysis in the presence of missing values. Journal of Machine Learning Research 11 (2010)
  4. Pearson, K.: On lines and planes of closest fit to systems of points in space. Philosophical Magazine 2(6), 559–572 (1901)
  5. Jolliffe, I.: Principal Component Analysis. Springer, Heidelberg (1986)
  6. Bishop, C.: Pattern Recognition and Machine Learning. Springer, Heidelberg (2006)
  7. Diamantaras, K., Kung, S.: Principal Component Neural Networks: Theory and Application. Wiley, Chichester (1996)
  8. Tsymbal, A., Puuronen, S., Pechenizkiy, M., Baumgarten, M., Patterson, D.: Eigenvector-based feature extraction for classification
  9. Haykin, S.: Modern Filters. Macmillan (1989)
  10. Cichocki, A., Amari, S.: Adaptive Blind Signal and Image Processing: Learning Algorithms and Applications. Wiley, Chichester (2002)
  11. Oja, E.: Neural networks, principal components, and subspaces. International Journal of Neural Systems 1(1), 61–68 (1989)
  12. Raiko, T., Ilin, A., Karhunen, J.: Principal component analysis for large scale problems with lots of missing values. In: Proc. 18th European Conference on Machine Learning (ECML 2007), Warsaw, Poland (September 2007)
  13. Tipping, M., Bishop, C.: Probabilistic principal component analysis. Journal of the Royal Statistical Society: Series B (Statistical Methodology) 61(3), 611–622 (1999)
  14. West, M.: Bayesian factor regression models in the "large p, small n" paradigm. Bayesian Statistics 7, 723–732 (2003)
  15. Grung, B., Manne, R.: Missing values in principal components analysis. Chemometrics and Intelligent Laboratory Systems 42(1), 125–139 (1998)
  16. Bishop, C.: Variational principal components. In: Proc. 9th Int. Conf. on Artificial Neural Networks (ICANN99), pp. 509–514 (1999)
  17. Oba, S., Sato, M., Takemasa, I., Monden, M., Matsubara, K., Ishii, S.: A Bayesian missing value estimation method for gene expression profile data. Bioinformatics 19(16), 2088–2096 (2003)
  18. Netflix: Netflix Prize webpage (2007), http://www.netflixprize.com/
Index Terms

Computer Science
Information Sciences

Keywords

Principal Component Analysis
Eigenvectors
Dimensionality Reduction