Research Article

An Appraise of KNN to the Perfection

by Pooja Rani, Jyoti Vashishtha
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 170 - Number 2
Year of Publication: 2017
Authors: Pooja Rani, Jyoti Vashishtha
DOI: 10.5120/ijca2017914696

Pooja Rani, Jyoti Vashishtha. An Appraise of KNN to the Perfection. International Journal of Computer Applications 170, 2 (Jul 2017), 13-17. DOI=10.5120/ijca2017914696

@article{ 10.5120/ijca2017914696,
author = { Pooja Rani and Jyoti Vashishtha },
title = { An Appraise of KNN to the Perfection },
journal = { International Journal of Computer Applications },
issue_date = { Jul 2017 },
volume = { 170 },
number = { 2 },
month = { Jul },
year = { 2017 },
issn = { 0975-8887 },
pages = { 13-17 },
numpages = { 5 },
url = { https://ijcaonline.org/archives/volume170/number2/28041-2017914696/ },
doi = { 10.5120/ijca2017914696 },
publisher = { Foundation of Computer Science (FCS), NY, USA },
address = { New York, USA }
}
%0 Journal Article
%A Pooja Rani
%A Jyoti Vashishtha
%T An Appraise of KNN to the Perfection
%J International Journal of Computer Applications
%@ 0975-8887
%V 170
%N 2
%P 13-17
%D 2017
%I Foundation of Computer Science (FCS), NY, USA
Abstract

K-Nearest Neighbor (KNN) is a highly effective classification algorithm owing to its key features: it is very easy to use and implement, requires low training time, and is robust to noisy training data. However, it also has shortcomings, including high computational complexity, large memory requirements for large training datasets, the curse of dimensionality, and equal weights given to all attributes. Many researchers have suggested advancements and improvements to KNN to overcome these shortcomings. This paper appraises those advancements and improvements.
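
To make the contrast in the abstract concrete, the sketch below shows classical KNN next to the two refinements the paper surveys: distance weighting and attribute weighting. It is a minimal illustration only, not code from the paper; the function name knn_predict, the 1/distance voting rule, and the attr_weights parameter are assumptions chosen for this sketch.

import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3, distance_weighted=False,
                attr_weights=None):
    # Attribute-weighted KNN (illustrative): scale each attribute before
    # measuring distance, so informative attributes count more.
    if attr_weights is not None:
        X_train = X_train * attr_weights
        x = x * attr_weights
    # Euclidean distance from the query point to every training point.
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]  # indices of the k closest training points
    if not distance_weighted:
        # Classical KNN: plain majority vote among the k neighbors.
        return Counter(y_train[nearest]).most_common(1)[0][0]
    # Distance-weighted KNN: each neighbor votes with weight 1/distance,
    # so closer neighbors influence the prediction more.
    votes = {}
    for i in nearest:
        w = 1.0 / (dists[i] + 1e-9)  # small epsilon avoids division by zero
        votes[y_train[i]] = votes.get(y_train[i], 0.0) + w
    return max(votes, key=votes.get)

# Example: the query sits next to two "a" points and far from the lone "b".
X = np.array([[1.0, 0.0], [1.1, 0.1], [5.0, 5.0]])
y = np.array(["a", "a", "b"])
print(knn_predict(X, y, np.array([1.05, 0.05]), k=3, distance_weighted=True))  # -> a

Plain unweighted voting can misclassify a query whenever most of its k neighbors are distant outliers; 1/distance weighting is one common remedy that the surveyed work builds on (e.g., ref. 19).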

References
  1. N. Padhy 2012. “The Survey of Data Mining Applications and Feature Scope,” Int. J. Comput. Sci. Eng. Inf. Technol., vol. 2, no. 3, pp. 43–58, Jun.
  2. J. Han and M. Kamber. 2006. Data mining: concepts and techniques, 2nd ed. San Francisco, CA: Elsevier.
  3. S. Bagga and G. N. Singh 2012. “Applications of Data Mining,” Int. J. Sci. Emerg. Technol. Latest Trends, vol. 1, no. 1, pp. 19–23.
  4. S. Sethi, D. Malhotra, and N. Verma. 2006. “Data Mining: Current Applications & Trends,” Int. J. Innov. Eng. Technol. IJIET, vol. 6, no. 14, pp. 667–673, Apr.
  5. M. E. Syed. 2014. “Attribute weighting in k-nearest neighbor classification,” University of Tampere.
  6. L. Jiang, H. Zhang, and Z. Cai. 2006. “Dynamic k-nearest-neighbor naive bayes with attribute weighted,” in International Conference on Fuzzy Systems and Knowledge Discovery, 2006, pp. 365–368.
  7. I. H. Witten and E. Frank. 2005. Data mining: practical machine learning tools and techniques, 2nd ed. Amsterdam; Boston, MA: Elsevier.
  8. S. B. Imandoust and M. Bolandraftar. 2013. “Application of k-nearest neighbor (KNN) approach for predicting economic events: Theoretical background,” Int. J. Eng. Res. Appl., vol. 3, no. 5, pp. 605–610.
  9. R. Kumar and R. Verma. 2012. “Classification algorithms for data mining: A survey,” Int. J. Innov. Eng. Technol. IJIET, vol. 1, no. 2, pp. 7–14.
  10. L. Jiang, Z. Cai, D. Wang, and S. Jiang. 2007. “Survey of improving k-nearest-neighbor for classification,” presented at the Fourth International Conference on Fuzzy Systems and Knowledge Discovery, 2007, vol. 1, pp. 679–683.
  11. L. Jiang, H. Zhang, and Z. Cai. 2006. “Dynamic k-nearest-neighbor naive bayes with attribute weighted,” presented at the International Conference on Fuzzy Systems and Knowledge Discovery, 2006, pp. 365–368.
  12. D. P. Vivencio, E. R. Hruschka, M. do Carmo Nicoletti, E. B. dos Santos, and S. D. Galvao. 2007. “Feature-weighted k-nearest neighbor classifier,” presented at the Foundations of Computational Intelligence, 2007, pp. 481–486.
  13. W. Baobao, M. Jinsheng, and S. Minru. 2008. “An enhancement of K-Nearest Neighbor algorithm using information gain and extension relativity,” presented at the International Conference on Condition Monitoring and Diagnosis, 2008, pp. 1314–1317.
  14. X. Xiao and H. Ding. 2012. “Enhancement of K-nearest neighbor algorithm based on weighted entropy of attribute value,” presented at the Fourth International Conference on Advanced Computing & Communication Technologies, 2012, pp. 1261–1264.
  15. Z. Li, Z. Chengjin, X. Qingyang, and L. Chunfa. 2015. “Weighted-KNN and its application on UCI,” presented at the International Conference on Information and Automation, 2015, pp. 1748–1750.
  16. X. Li and C. Xiang. 2012. “Correlation-based K-nearest neighbor algorithm,” presented at the 3rd International Conference on Software Engineering and Service Science, 2012, pp. 185–187.
  17. K. Hechenbichler and K. Schliep. 2004. “Weighted k-nearest-neighbor techniques and ordinal classification,” Institut für Statistik, Sonderforschungsbereich 386.
  18. Maryam Kuhkan. 2016. “A Method to Improve the Accuracy of K-Nearest Neighbor Algorithm,” Int. J. Comput. Eng. Inf. Technol. IJCEIT, vol. 8, no. 6, pp. 90–95, Jun.
  19. J. Gou, L. Du, Y. Zhang, T. Xiong, et al. 2012. “A new distance-weighted k-nearest neighbor classifier,” J. Inf. Comput. Sci., vol. 9, no. 6, pp. 1429–1436.
  20. J. Wu, Z. Cai, and S. Ao. 2012. “Hybrid dynamic k-nearest-neighbour and distance and attribute weighted method for classification,” Int. J. Comput. Appl. Technol., vol. 43, no. 4, pp. 378–384.
  21. S. Taneja, C. Gupta, K. Goyal, and D. Gureja. 2014. “An Enhanced K-Nearest Neighbor Algorithm Using Information Gain and Clustering,” presented at the Fourth International Conference on Advanced Computing & Communication Technologies, 2014, pp. 325–329.
Index Terms

Computer Science
Information Sciences

Keywords

K-Nearest Neighbor, KNN, Distance-weighted KNN, Attribute-weighted KNN