International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 65 - Number 23
Year of Publication: 2013
Authors: Deepak Kanojia, Mahak Motwani
DOI: 10.5120/11228-6545
Deepak Kanojia, Mahak Motwani. Comparison of Naive Basian and K-NN Classifier. International Journal of Computer Applications 65(23):40-45, March 2013. DOI=10.5120/11228-6545
This paper compares the k-nearest neighbour (K-NN) and naïve Bayesian classifiers on subsets of features, where a sequential feature selection method is used to construct the subsets. Four categories of transcripts supply the experimental data (life and medical sciences, arts and humanities, social sciences, and physical sciences), and the results show that the K-NN classifier is competitive with the naïve Bayesian classifier. The classification performance of the K-NN classifier is far better than that of the naïve Bayesian classifier when the learning parameters and the number of samples are small, but as the number of samples increases, the naïve Bayesian classifier outperforms K-NN. On the other hand, the naïve Bayesian classifier is much better than the K-NN classifier when computational demand and memory requirements are considered. This paper demonstrates the strength of the naïve Bayesian classifier for classification and summarizes some of the most important developments in naïve Bayesian and k-nearest neighbour classification research. Specifically, it examines posterior probability estimation, the link between naïve Bayesian and K-NN classifiers, the learning and generalization trade-off in classification, feature variable selection, and the effect of misclassification costs. The purpose is to provide a synthesis of the published research in this area and to stimulate further research interest and effort in the identified topics.
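To make the trade-off concrete, the two classifiers can be sketched in a few lines of plain Python. This is a minimal illustration, not the paper's implementation: the toy 2-D dataset, the choice of Gaussian likelihoods for naïve Bayes, and k = 3 are all assumptions made here for demonstration. Note how K-NN stores every training sample and scans them all at query time, while naïve Bayes compresses the training set into a few per-class statistics, which reflects the memory and computational contrast the abstract describes.

```python
import math
from collections import Counter

# Hypothetical toy dataset: 2-D points with two class labels.
train = [((1.0, 1.0), 'A'), ((1.2, 0.8), 'A'), ((0.9, 1.1), 'A'),
         ((3.0, 3.0), 'B'), ((3.2, 2.9), 'B'), ((2.8, 3.1), 'B')]

def knn_predict(x, k=3):
    """K-NN: no training step, but every query scans all stored samples
    (the memory and compute demand noted in the abstract)."""
    nearest = sorted(train, key=lambda s: math.dist(x, s[0]))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

def nb_fit():
    """Gaussian naïve Bayes training: keep only a per-class prior plus a
    per-feature mean and variance, assuming features are conditionally
    independent given the class."""
    stats = {}
    for label in {l for _, l in train}:
        pts = [x for x, l in train if l == label]
        n = len(pts)
        means = [sum(col) / n for col in zip(*pts)]
        vars_ = [sum((v - m) ** 2 for v in col) / n + 1e-9  # small floor avoids /0
                 for col, m in zip(zip(*pts), means)]
        stats[label] = (n / len(train), means, vars_)
    return stats

def nb_predict(x, stats):
    """Pick the class maximizing log prior + sum of log Gaussian likelihoods
    (an estimate of the log posterior probability)."""
    def log_post(label):
        prior, means, vars_ = stats[label]
        ll = sum(-0.5 * math.log(2 * math.pi * v) - (xi - m) ** 2 / (2 * v)
                 for xi, m, v in zip(x, means, vars_))
        return math.log(prior) + ll
    return max(stats, key=log_post)

stats = nb_fit()
print(knn_predict((1.1, 1.0)))        # 'A'
print(nb_predict((3.1, 3.0), stats))  # 'B'
```

With only six samples both methods agree on clearly separated queries; the paper's point is that their relative accuracy and cost diverge as the sample count and feature subsets change.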