Research Article

A Survey on Hand Gesture Recognition Systems

by Ankith A. Prabhu, E. Sasikala
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 180 - Number 30
Year of Publication: 2018
Authors: Ankith A. Prabhu, E. Sasikala
DOI: 10.5120/ijca2018916759

Ankith A. Prabhu, E. Sasikala. A Survey on Hand Gesture Recognition Systems. International Journal of Computer Applications. 180, 30 (Apr 2018), 17-20. DOI=10.5120/ijca2018916759

@article{ 10.5120/ijca2018916759,
author = { Ankith A. Prabhu, E. Sasikala },
title = { A Survey on Hand Gesture Recognition Systems },
journal = { International Journal of Computer Applications },
issue_date = { Apr 2018 },
volume = { 180 },
number = { 30 },
month = { Apr },
year = { 2018 },
issn = { 0975-8887 },
pages = { 17-20 },
numpages = { 4 },
url = { https://ijcaonline.org/archives/volume180/number30/29233-2018916759/ },
doi = { 10.5120/ijca2018916759 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
Abstract

Gesture recognition is the interpretation of human gestures via mathematical models. Gestures can originate from any bodily motion, but this paper focuses exclusively on hand gesture recognition. Hand gesture recognition is a form of Perceptual User Interface (PUI), which enables Human-Computer Interaction (HCI) without the use of a mouse or a keyboard. Gestures are used primarily to interact with devices without any physical contact with the device. Successful gesture recognition depends on the accuracy and efficiency of gesture classification; the gestures are classified using dynamic programming, machine learning, or deep learning techniques. In all gesture recognition systems, the relevant input data is collected by sensors, and a good gesture recognition system uses this input data to classify the gesture accurately and efficiently. Gesture recognition is deployed in a number of fields, such as the medical field, where it is used to build sign language interpretation devices for the vocally impaired, as well as in virtual gaming and smart home environments.
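
As one concrete illustration of the dynamic-programming approach mentioned in the abstract, the sketch below matches a recorded 3-axis accelerometer trace against stored gesture templates using Dynamic Time Warping (DTW), the technique behind template-based recognizers such as uWave and the accelerated DTW of Hussain and Harun-ur Rashid. This is a minimal illustrative sketch, not code from any of the surveyed systems; the function names, the templates dictionary, and the nearest-template classification rule are assumptions made purely for the example.

# Minimal sketch (illustrative, not from the paper): DTW over 3-axis
# accelerometer traces, with nearest-template classification.
import numpy as np

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """DTW cost between two gesture traces, each of shape (length, 3)."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])   # frame-to-frame distance
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def classify(sample: np.ndarray, templates: dict) -> str:
    """Return the label of the stored template nearest to the sample."""
    return min(templates, key=lambda label: dtw_distance(sample, templates[label]))

# Usage sketch: templates maps gesture labels to recorded example traces.
# templates = {"circle": circle_trace, "swipe": swipe_trace}
# predicted = classify(new_trace, templates)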

References
  1. Q. Chen, N. D. Georganas and E. M. Petriu, "Real-time Vision-based Hand Gesture Recognition Using Haar-like Features," 2007 IEEE Instrumentation & Measurement Technology Conference (IMTC 2007).
  2. K. K. Biswas and S. K. Basu, "Gesture recognition using Microsoft Kinect," The 5th International Conference on Automation, Robotics and Applications, Wellington, 2011, pp. 100-103.
  3. S. Mitra and T. Acharya, "Gesture Recognition: A Survey," IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), vol. 37, no. 3, pp. 311-324, May 2007.
  4. Jiayang Liu, Zhen Wang, Lin Zhong, Jehan Wickramasuriya, and Venu Vasudevan, "uWave: Accelerometer-based personalized gesture recognition and its applications," IEEE Int. Conf. Pervasive Computing and Communications (PerCom), March 2009.
  5. SmartWatch Gestures Dataset, Technologies of Vision, Fondazione Bruno Kessler. [Online]. Available: https://tev.fbk.eu/technologies/smartwatch-gestures-dataset.
  6. Gabriele Costante, Lorenzo Porzi, Oswald Lanz, Paolo Valigi, and Elisa Ricci, "Personalizing a Smartwatch-based Gesture Interface With Transfer Learning," 22nd European Signal Processing Conference (EUSIPCO), 2014.
  7. Anetha K and Rejina Parvin J, "Hand Talk - A Sign Language Recognition Based on Accelerometer and sEMG Data," IJIRCCE, vol. 2, no. 3, pp. 206-215, Jul 2014. [Online].
  8. T. Pylvänäinen, "Accelerometer Based Gesture Recognition Using Continuous HMMs," in J. S. Marques, N. Pérez de la Blanca, P. Pina (eds), Pattern Recognition and Image Analysis (IbPRIA 2005), Lecture Notes in Computer Science, vol. 3522, Springer, Berlin, Heidelberg, 2005.
  9. X. Zhang, X. Chen, Y. Li, V. Lantz, K. Wang, and J. Yang, "A framework for hand gesture recognition based on accelerometer and EMG sensors," IEEE Trans. Syst., Man, Cybern. A, vol. 41, no. 6, pp. 1064-1076, 2011.
  10. H. Sekar, R. Rajashekar, G. Srinivasan, P. Suresh and V. Vijayaraghavan, "Low-cost intelligent static gesture recognition system," 2016 Annual IEEE Systems Conference (SysCon), Orlando, FL, 2016, pp. 1-6.
  11. T. Chouhan, A. Panse, A. K. Voona and S. M. Sameer, "Smart glove with gesture recognition ability for the hearing and speech impaired," 2014 IEEE Global Humanitarian Technology Conference - South Asia Satellite (GHTC-SAS), Trivandrum, 2014, pp. 105-110.
  12. G. Lefebvre, S. Berlemont, F. Mamalet, and C. Garcia, "BLSTM-RNN Based 3D Gesture Classification," in Artificial Neural Networks and Machine Learning - ICANN 2013, Lecture Notes in Computer Science, vol. 8131, Springer, Berlin, Heidelberg, 2013.
  13. T. Starner, J. Weaver, and A. Pentland, "Real-Time American Sign Language Recognition Using Desk and Wearable Computer Based Video," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 20, no. 12, pp. 1371-1375, Dec 1998.
  14. S. M. A. Hussain and A. B. M. Harun-ur Rashid, "User Independent Hand Gesture Recognition by Accelerated DTW," IEEE/OSA/IAPR International Conference on Informatics, Electronics & Vision, Dhaka, Bangladesh, 2012.
Index Terms

Computer Science
Information Sciences

Keywords

Multi-modal, User Dependent, User Independent, Mixed User.