Research Article

Dimensionality Reduction and Classification through PCA and LDA

by Telgaonkar Archana H., Deshmukh Sachin
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 122 - Number 17
Year of Publication: 2015
Authors: Telgaonkar Archana H., Deshmukh Sachin
10.5120/21790-5104

Telgaonkar Archana H., Deshmukh Sachin. Dimensionality Reduction and Classification through PCA and LDA. International Journal of Computer Applications 122, 17 (July 2015), 4-8. DOI=10.5120/21790-5104

@article{10.5120/21790-5104,
  author     = { Telgaonkar, Archana H. and Deshmukh, Sachin },
  title      = { Dimensionality Reduction and Classification through PCA and LDA },
  journal    = { International Journal of Computer Applications },
  issue_date = { July 2015 },
  volume     = { 122 },
  number     = { 17 },
  month      = { July },
  year       = { 2015 },
  issn       = { 0975-8887 },
  pages      = { 4-8 },
  numpages   = { 5 },
  url        = { https://ijcaonline.org/archives/volume122/number17/21790-5104/ },
  doi        = { 10.5120/21790-5104 },
  publisher  = { Foundation of Computer Science (FCS), NY, USA },
  address    = { New York, USA }
}
%0 Journal Article
%A Telgaonkar Archana H.
%A Deshmukh Sachin
%T Dimensionality Reduction and Classification through PCA and LDA
%J International Journal of Computer Applications
%@ 0975-8887
%V 122
%N 17
%P 4-8
%D 2015
%I Foundation of Computer Science (FCS), NY, USA
Abstract

An information explosion has occurred in most sciences in the last few decades due to advances in data collection and storage capacity. Datasets with large numbers of observations present new challenges in data mining, analysis, and classification. Traditional statistical methods break down partly because of the increase in the number of variables associated with each observation; such data is known as high-dimensional data. Much of this data is highly redundant and can be discarded when extracting features from the dataset. Dimensionality Reduction is the process of mapping high-dimensional data to a lower-dimensional space so as to discard uninformative variance, or of finding a subspace in which the data can be more easily separated. In this paper, two well-known Dimensionality Reduction techniques, Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA), are studied. Performance analysis is carried out on the high-dimensional datasets UMIST, COIL, and YALE, which consist of images of objects and human faces. The objects are classified using the KNN classifier and the Naive Bayes classifier to compare the performance of these techniques. The difference between supervised and unsupervised learning is also inferred from these results.
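The pipeline described in the abstract (reduce dimensionality with PCA or LDA, then classify with KNN or Naive Bayes) can be sketched as follows. This is not the authors' code: it is a minimal illustration using scikit-learn, with its bundled digits dataset standing in for the UMIST/COIL/YALE image datasets, and the component count and neighbor count chosen arbitrarily for the example.

```python
# Sketch: compare PCA (unsupervised) and LDA (supervised) as dimensionality
# reducers, each followed by KNN and Gaussian Naive Bayes classifiers.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

X, y = load_digits(return_X_y=True)  # 64-dim image vectors, 10 classes
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

reducers = {
    "PCA": PCA(n_components=9),  # unsupervised: directions of max variance
    "LDA": LinearDiscriminantAnalysis(n_components=9),  # supervised: <= classes-1
}
classifiers = {"KNN": KNeighborsClassifier(5), "NaiveBayes": GaussianNB()}

results = {}
for rname, reducer in reducers.items():
    # PCA ignores the labels passed to fit; LDA requires them.
    Z_tr = reducer.fit(X_tr, y_tr).transform(X_tr)
    Z_te = reducer.transform(X_te)
    for cname, clf in classifiers.items():
        acc = clf.fit(Z_tr, y_tr).score(Z_te, y_te)
        results[(rname, cname)] = acc
        print(f"{rname} + {cname}: accuracy = {acc:.3f}")
```

The contrast the paper draws between supervised and unsupervised learning shows up directly here: LDA uses the class labels to choose its projection, while PCA chooses directions of maximum variance regardless of class.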

References
  1. Y. Pang, Y. Yuan, and X. Li. 2008. Effective feature extraction in high dimensional space. IEEE Trans. Syst., Man, Cybern. B, Cybern.
  2. Y. Yang, F. Wu, D. Xu, Y. Zhuang, and L.-T. Chia. 2010. Cross-media retrieval using query dependent search methods. Pattern Recognition.
  3. S. Dudoit, J. Fridlyand, and T. P. Speed. 2002. Comparison of discrimination methods for the classification of tumors using gene expression data. J. Amer. Stat. Assoc.
  4. S. Xiang, F. Nie, C. Zhang, and C. Zhang. 2009. Interactive natural image segmentation via spline regression. IEEE Trans. Image Process.
  5. M. Turk and A. Pentland. 1991. Face recognition using eigenfaces. In Proceedings CVPR '91, IEEE Computer Society Conference on Computer Vision and Pattern Recognition.
  6. P. N. Belhumeur, J. P. Hespanha, and D. J. Kriegman. 1997. Eigenfaces vs. Fisherfaces: recognition using class specific linear projection. IEEE Trans. Pattern Anal. Mach. Intell.
  7. F. Nie, D. Xu, and X. Li. 2011. Semi-supervised dimensionality reduction and classification through virtual label regression.
  8. L. I. Smith. 2002. A tutorial on Principal Components Analysis.
  9. Creative Commons Attribution-Share Alike 3.0 Unported. http://creativecommons.org/licenses/by-sa/3.0/
  10. A. K. Ghosh. 2005. On optimum choice of k in nearest neighbor classification. Theoretical Statistics and Mathematics Unit, Indian Statistical Institute.
  11. G. Shakhnarovich, T. Darrell, and P. Indyk. 2005. Nearest-Neighbor Methods in Learning and Vision. MIT Press.
  12. D. B. Graham and N. M. Allinson. 1998. Characterizing virtual eigensignatures for general purpose face recognition. In Face Recognition: From Theory to Applications.
  13. S. A. Nene, S. K. Nayar, and H. Murase. 1996. Columbia Object Image Library (COIL-20). Columbia Univ., New York, Tech. Rep. CUCS-005-96.
  14. A. Georghiades, P. Belhumeur, and D. Kriegman. 2001. From few to many: Illumination cone models for face recognition under variable lighting and pose. IEEE Trans. Pattern Anal. Mach. Intell.
Index Terms

Computer Science
Information Sciences

Keywords

Classification, Dimensionality Reduction, KNN, LDA, PCA, Naive Bayes