Research Article

Hand Gestures for Hearing Impaired and Inarticulate People using CNN

by T. R. Shinde, Shyam Gupta
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 175 - Number 13
Year of Publication: 2020
Authors: T. R. Shinde, Shyam Gupta
10.5120/ijca2020920624

T. R. Shinde, Shyam Gupta. Hand Gestures for Hearing Impaired and Inarticulate People using CNN. International Journal of Computer Applications 175, 13 (Aug 2020), 38-41. DOI=10.5120/ijca2020920624

@article{ 10.5120/ijca2020920624,
author = { T. R. Shinde and Shyam Gupta },
title = { Hand Gestures for Hearing Impaired and Inarticulate People using CNN },
journal = { International Journal of Computer Applications },
issue_date = { Aug 2020 },
volume = { 175 },
number = { 13 },
month = { Aug },
year = { 2020 },
issn = { 0975-8887 },
pages = { 38-41 },
numpages = { 4 },
url = { https://ijcaonline.org/archives/volume175/number13/31515-2020920624/ },
doi = { 10.5120/ijca2020920624 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A T. R. Shinde
%A Shyam Gupta
%T Hand Gestures for Hearing Impaired and Inarticulate People using CNN
%J International Journal of Computer Applications
%@ 0975-8887
%V 175
%N 13
%P 38-41
%D 2020
%I Foundation of Computer Science (FCS), NY, USA
Abstract

Sign languages are an extremely important communication tool for many mute and hard-of-hearing people. They are the native languages of the deaf and mute community and provide full access to communication. Hearing lets us perceive one another's thoughts; but what if a person cannot hear at all and, as a result, cannot speak? Sign language is then the main communication tool for hearing-impaired and mute people, and to help ensure an independent life for them, the automatic interpretation of sign language has become an extensive research area. Many techniques and algorithms have been developed in this area using image processing and artificial intelligence. Every sign language recognition system is trained to recognize gestures/signs and convert them into the required pattern. The proposed system aims to give speech to the inarticulate: in this paper, double-handed Indian Sign Language is captured as a series of images and processed with the help of Python to produce speech and text as output.
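
The recognition pipeline described in the abstract (threshold the captured hand image, classify it with a CNN, and voice the predicted sign) can be sketched in a few lines of Python. The snippet below is a minimal illustration under assumed details: a small Keras CNN, OpenCV Otsu thresholding, a placeholder label set, and a hypothetical input file "gesture.jpg". It is not the authors' exact implementation.

import cv2                       # image preprocessing
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

IMG_SIZE = 64                    # assumed input resolution
CLASSES = ["A", "B", "C"]        # placeholder sign labels (assumption)

def preprocess(frame_bgr):
    """Grayscale, blur, and Otsu-threshold a frame, then resize and normalize it."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    blur = cv2.GaussianBlur(gray, (5, 5), 0)
    # Otsu thresholding separates the hand from the background
    _, binary = cv2.threshold(blur, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    resized = cv2.resize(binary, (IMG_SIZE, IMG_SIZE))
    return resized.astype("float32")[..., np.newaxis] / 255.0

def build_cnn(num_classes):
    """A small CNN classifier of the kind typically used for static hand gestures."""
    return tf.keras.Sequential([
        layers.Input(shape=(IMG_SIZE, IMG_SIZE, 1)),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),
    ])

if __name__ == "__main__":
    model = build_cnn(len(CLASSES))
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    # model.fit(...) would be called here with a labelled gesture dataset.

    frame = cv2.imread("gesture.jpg")           # hypothetical captured image
    if frame is not None:
        x = preprocess(frame)[np.newaxis, ...]
        label = CLASSES[int(np.argmax(model.predict(x, verbose=0)))]
        print("Predicted sign:", label)
        # A text-to-speech engine (e.g. pyttsx3) could then voice the label:
        # import pyttsx3; tts = pyttsx3.init(); tts.say(label); tts.runAndWait()

In the setting the paper describes, the same preprocessing and model would be applied image by image to captured double-handed Indian Sign Language gestures, with the predicted labels emitted as text and speech.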

Index Terms

Computer Science
Information Sciences

Keywords

Machine Learning, Convolutional Neural Network, Hand Gesture, Threshold