Research Article

Handwritten Digit Recognition using Slope Detail Features

by A. M. Hafiz, G. M. Bhat
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 93 - Number 5
Year of Publication: 2014
Authors: A. M. Hafiz, G. M. Bhat
DOI: 10.5120/16210-5512

A. M. Hafiz, G. M. Bhat. Handwritten Digit Recognition using Slope Detail Features. International Journal of Computer Applications 93, 5 (May 2014), 14-19. DOI=10.5120/16210-5512

@article{ 10.5120/16210-5512,
author = { A. M. Hafiz, G. M. Bhat },
title = { Handwritten Digit Recognition using Slope Detail Features },
journal = { International Journal of Computer Applications },
issue_date = { May 2014 },
volume = { 93 },
number = { 5 },
month = { May },
year = { 2014 },
issn = { 0975-8887 },
pages = { 14-19 },
numpages = { 6 },
url = { https://ijcaonline.org/archives/volume93/number5/16210-5512/ },
doi = { 10.5120/16210-5512 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A A. M. Hafiz
%A G. M. Bhat
%T Handwritten Digit Recognition using Slope Detail Features
%J International Journal of Computer Applications
%@ 0975-8887
%V 93
%N 5
%P 14-19
%D 2014
%I Foundation of Computer Science (FCS), NY, USA
Abstract

This paper introduces new features, called Slope Detail (SD) features, for handwritten digit recognition. The features are based on shape analysis of the digit image and capture slant, or slope, information. They are effective on their own, and when combined with commonly used features they further improve digit recognition accuracy. The k-Nearest Neighbour (k-NN) and Support Vector Machine (SVM) algorithms were used for classification. The data sets used are the Semeion Data Set and the United States Postal Service (USPS) Data Set. On the USPS Data Set an error rate of 1.3% was obtained, which the authors report to be better than any previously published error rate on that data set.
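
A minimal sketch of such a pipeline, under stated assumptions: the paper's exact Slope Detail feature definition is not reproduced here, so the sketch approximates "slope information" with a histogram of local gradient orientations, uses scikit-learn's built-in 8x8 digits in place of the USPS/Semeion data, and concatenates it with raw pixel features before SVM classification. All function names, parameters, and the data set choice below are illustrative assumptions, not the authors' implementation.

import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

def slope_features(img, bins=8):
    # Histogram of local gradient orientations, weighted by gradient
    # magnitude -- a rough stand-in for slope/slant information.
    gy, gx = np.gradient(img.astype(float))
    angles = np.arctan2(gy, gx)
    weights = np.hypot(gx, gy)
    hist, _ = np.histogram(angles, bins=bins, range=(-np.pi, np.pi), weights=weights)
    return hist / (hist.sum() + 1e-9)

digits = load_digits()                                      # 8x8 digit images (stand-in data set)
sd = np.array([slope_features(im) for im in digits.images]) # slope-style features
pixels = digits.images.reshape(len(digits.images), -1)      # commonly used raw-pixel features
X = np.hstack([pixels, sd])                                 # combined feature vector

X_tr, X_te, y_tr, y_te = train_test_split(X, digits.target, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf", C=10, gamma="scale").fit(X_tr, y_tr)
print("held-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))

In the same spirit, the SVC classifier above could be swapped for sklearn.neighbors.KNeighborsClassifier to mirror the paper's k-NN experiments.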

References
  1. R. Plamondon and S. N. Srihari, "On-Line and Off-Line Handwriting Recognition: A Comprehensive Survey," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 22, pp. 68-69, 2000.
  2. Q. F. Wang, F. Yin, and C. L. Liu, "Handwritten Chinese Text Recognition by Integrating Multiple Contexts," Trans. Pattern Analysis and Machine Intelligence, vol. 34, 2012.
  3. A. M. Namboodiri and A. K. Jain, "Online Handwritten Script Recognition," Trans. Pattern Analysis and Machine Intelligence, vol. 26, 2004.
  4. A. Vinciarelli, S. Bengio, and H. Bunke, "Offline Recognition of Unconstrained Handwritten Texts Using HMMs and Statistical Language Models," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 26, 2004.
  5. S. Marinai, M. Gori, and G. Soda, "Artificial Neural Networks for Document Analysis and Recognition," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 27, 2005.
  6. Semeion Handwritten Digit Data Set, Semeion Research Center of Sciences of Communication, Via Sersale 117, 00128 Rome, Italy; Tattile, Via Gaetano Donizetti 1-3-5, 25030 Mairano (Brescia), Italy.
  7. S. Roweis. USPS Handwritten Digit Dataset [Online]. Available: http://www.cs.nyu.edu/~roweis/data.html
  8. A. Frank and A. Asuncion, ed. UCI Machine Learning Repository, School of Information and Computer Science, University of California, Irvine, CA, 2010.
  9. C. C. Chang, C. W. Hsu, and C. J. Lin, "Practical Guide to Support Vector Classification," 2009.
  10. C. Cortes and V. Vapnik, "Support-vector network," Machine Learning, vol. 20, pp. 273-297, 1995.
  11. S. S. Keerthi and C. J. Lin, "Asymptotic behaviors of support vector machines with Gaussian kernel," Neural Computation, 2003.
  12. C.-C. Chang and C.-J. Lin, "LIBSVM: A library for support vector machines," ACM Transactions on Intelligent Systems and Technology, vol. 2, pp. 1-27, 2011.
  13. T. M. Cover and P. E. Hart, "Nearest Neighbor Pattern Classification," IEEE Transactions on Information Theory, vol. 13, pp. 21-27, 1967.
  14. P. Cunningham and S. J. Delaney, "k-Nearest Neighbour Classifiers," 2007.
  15. M. Weeks, V. Hodge, S. O'Keefe, J. Austin, and K. Lees, "Improved AURA k-nearest neighbour approach," in Proc. of the 7th International Work-Conference on Artificial and Natural Neural Networks, J. Mira and J. R. Alvarez, Eds., 2003.
  16. H. Liu and X. Ding, "Handwritten Character Recognition Using Gradient Feature and Quadratic Classifier with Multiple Discrimination Schemes," in 8th ICDAR, 2005, pp. 19-23.
  17. L. J. P. v. d. Maaten and G. E. Hinton, "Visualizing High-Dimensional Data Using t-SNE," Journal of Machine Learning Research, vol. 9, pp. 2579-2605, 2008.
  18. P. Simard, Y. LeCun, and J. Denker, "Tangent prop - a formalism for specifying selected invariance in an adaptive network," in Advances in Neural Information Processing Systems, vol. 4, J. E. Moody, S. J. Hanson, and R. P. Lippmann, Eds. San Mateo, CA: Morgan Kaufmann, 1993.
  19. Y. LeCun, B. Boser, J. S. Denker, D. Henderson, R. E. Howard, W. Hubbard, et al., "Backpropagation applied to handwritten zip code recognition," Neural Computation, vol. 1, pp. 541-551, 1989.
  20. B. E. Boser, I. M. Guyon, and V. N. Vapnik, "A training algorithm for optimal margin classifiers," in 5th Annual ACM Workshop on Computational Learning Theory, Pittsburg, PA, 1992.
  21. B. Scholkopf, C. J. C. Burges, and V. Vapnik, "Extracting support data for a given task," in First International Conference on Knowledge Discovery and Data Mining, 1995, pp. 252-257.
  22. T. Deselaers, T. Gass, G. Heigold, and H. Ney, "Latent Log-Linear Models for Handwritten Digit Classification," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 34, pp. 1105-1117, 2012.
  23. L. Bottou and V. Vapnik, "Local Learning algorithm," Neural Computation, vol. 4, pp. 888-901, 1992.
  24. B. Scholkopf, C. J. C. Burges, and V. Vapnik, "Incorporating invariances in support vector learning machines," in Artificial Neural Networks, 1996, pp. 47-52.
  25. B. Scholkopf, "Support Vector Learning," Ph.D. dissertation, Technical University of Berlin, München, 1997.
  26. B. Haasdonk, "Transformation Knowledge in Pattern Analysis with Kernel Methods," Ph.D. dissertation, Albert-Ludwigs-Universität, Freiburg, 2005.
  27. D. Keysers, T. Deselaers, C. Gollan, and H. Ney, "Deformation models for image recognition," IEEE Transaction on Pattern Analysis and Machine Intelligence, vol. 29, pp. 1422-1435, 2007.
  28. J. X. Dong, A. Krzyzak, and C. Y. Suen, "Fast SVM Training Algorithm with Decomposition on Very Large Datasets," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, pp. 603-618, 2005.
  29. J.-X. Dong, A. Krzyzak, and C. Y. Suen, "A Practical SMO Algorithm for Pattern Classification," 2001.
  30. D. Keysers, R. Paredes, H. Ney, and E. Vidal, "Combination of Tangent Vectors and Local Representation for Handwritten Digit Recognition," in International Workshop on Statistical Pattern Recognition, 2002, pp. 538-547.
  31. J. Bromley and E. Sackinger, "Neural-network and k-nearest-neighbor classifiers," AT&T, 1991.
  32. J.-X. Dong, "Statistical Results of Human Performance on USPS database," Concordia University, 2001.
Index Terms

Computer Science
Information Sciences

Keywords

Slope Features, Handwritten digits, Pattern Classification, Nearest Neighbor, Support Vector Machine, Artificial Intelligence, Gradient Feature, USPS Data Set