Research Article

Introduction of a new Metric Hit Rate and its Variation with Scaling on Classification Algorithms

by Swapnil Ahuja
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 125 - Number 12
Year of Publication: 2015
Authors: Swapnil Ahuja
DOI: 10.5120/ijca2015906148

Swapnil Ahuja. Introduction of a new Metric Hit Rate and its Variation with Scaling on Classification Algorithms. International Journal of Computer Applications. 125, 12 (September 2015), 13-16. DOI=10.5120/ijca2015906148

@article{ 10.5120/ijca2015906148,
author = { Swapnil Ahuja },
title = { Introduction of a new Metric Hit Rate and its Variation with Scaling on Classification Algorithms },
journal = { International Journal of Computer Applications },
issue_date = { September 2015 },
volume = { 125 },
number = { 12 },
month = { September },
year = { 2015 },
issn = { 0975-8887 },
pages = { 13-16 },
numpages = {4},
url = { https://ijcaonline.org/archives/volume125/number12/22483-2015906148/ },
doi = { 10.5120/ijca2015906148 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
Abstract

This paper introduces a new metric, Hit Rate, and examines how it is affected by feature scaling, as well as how scaling affects the accuracy of different classification algorithms and is not always beneficial. The experiments use Python's widely used machine learning library scikit-learn, and the findings are further validated on two completely different datasets from the UCI Machine Learning Repository.
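
To make the experimental setup concrete, the following is a minimal sketch (not the paper's own code) of comparing classifier accuracy with and without feature scaling in scikit-learn. The wine dataset, the classifier choices, and the train/test split are illustrative assumptions rather than the paper's exact configuration, and the proposed Hit Rate metric itself is not reproduced here.

# Sketch: accuracy with vs. without feature scaling in scikit-learn.
# Dataset, classifiers, and split are illustrative assumptions only.
from sklearn.base import clone
from sklearn.datasets import load_wine
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier

# Wine dataset as a stand-in for a UCI classification dataset.
X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

classifiers = {
    "KNN": KNeighborsClassifier(),
    "Decision tree": DecisionTreeClassifier(random_state=0),
    "Random forest": RandomForestClassifier(random_state=0),
}

for name, clf in classifiers.items():
    # Accuracy on raw (unscaled) features.
    unscaled = clone(clf).fit(X_train, y_train)
    acc_raw = accuracy_score(y_test, unscaled.predict(X_test))

    # Accuracy with z-score scaling applied before the classifier.
    scaled = make_pipeline(StandardScaler(), clone(clf)).fit(X_train, y_train)
    acc_scaled = accuracy_score(y_test, scaled.predict(X_test))

    print(f"{name}: unscaled accuracy = {acc_raw:.3f}, "
          f"scaled accuracy = {acc_scaled:.3f}")

Distance-based methods such as KNN are typically sensitive to feature scaling, while tree-based methods such as decision trees and random forests are largely invariant to it, which is consistent with the abstract's point that scaling is not always beneficial.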

References
  1. S. B. Kotsiantis, I. D. Zaharakis, and P. E. Pintelas, “Machine learning: A review of classification and combining techniques,” Artificial Intelligence Review, vol. 26, no. 3, pp. 159–190, 2006.
  2. E. Alpaydin, Introduction to Machine Learning, 2010.
  3. F. Pedregosa, G. Varoquaux, A. Gramfort, V. Michel, B. Thirion, O. Grisel, M. Blondel, P. Prettenhofer, R. Weiss, V. Dubourg, and M. Brucher, “Scikit-learn: Machine Learning in Python,” Journal of Machine Learning Research, vol. 12, pp. 2825–2830, 2011.
  4. T. Cover and P. Hart, “Nearest neighbor pattern classification,” IEEE Transactions on Information Theory, vol. 13, no. 1, pp. 21–27, 1967.
  5. J. M. Keller and M. R. Gray, “A fuzzy K-nearest neighbor algorithm,” IEEE Transactions on Systems, Man, and Cybernetics, vol. SMC-15, no. 4, pp. 580–585, 1985.
  6. S. A. Nene and S. K. Nayar, “A Simple Algorithm for Nearest Neighbor Search in High Dimensions,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, no. 9, pp. 989–1003, 1997.
  7. S. K. Murthy, “Automatic Construction of Decision Trees from Data: A Multi-Disciplinary Survey,” Data Mining and Knowledge Discovery, vol. 2, pp. 345–389, 1998.
  8. S. R. Safavian and D. Landgrebe, “A survey of decision tree classifier methodology,” IEEE Transactions on Systems, Man, and Cybernetics, vol. 21, no. 3, pp. 660–674, 1991.
  9. L. Breiman, “Random forests,” Machine Learning, vol. 45, no. 1, pp. 5–32, 2001.
  10. T. Shi and S. Horvath, “Unsupervised Learning With Random Forest Predictors,” Journal of Computational and Graphical Statistics, vol. 15, no. 1, pp. 118–138, 2006.
  11. M. Lichman, “UCI Machine Learning Repository,” 2013. [Online]. Available: http://archive.ics.uci.edu/ml.
Index Terms

Computer Science
Information Sciences

Keywords

Hit Rate, KNN, Scikit-learn