Research Article

Enhancing an Eye-Tracker based Human-Computer Interface with Multi-modal Accessibility Applied for Text Entry

by Jai Vardhan Singh, Girijesh Prasad
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 130 - Number 16
Year of Publication: 2015
Authors: Jai Vardhan Singh, Girijesh Prasad
DOI: 10.5120/ijca2015907194

Jai Vardhan Singh, Girijesh Prasad. Enhancing an Eye-Tracker based Human-Computer Interface with Multi-modal Accessibility Applied for Text Entry. International Journal of Computer Applications 130, 16 (November 2015), 16-22. DOI=10.5120/ijca2015907194

@article{10.5120/ijca2015907194,
  author     = {Jai Vardhan Singh and Girijesh Prasad},
  title      = {Enhancing an Eye-Tracker based Human-Computer Interface with Multi-modal Accessibility Applied for Text Entry},
  journal    = {International Journal of Computer Applications},
  issue_date = {November 2015},
  volume     = {130},
  number     = {16},
  month      = {November},
  year       = {2015},
  issn       = {0975-8887},
  pages      = {16-22},
  numpages   = {7},
  url        = {https://ijcaonline.org/archives/volume130/number16/23293-2015907194/},
  doi        = {10.5120/ijca2015907194},
  publisher  = {Foundation of Computer Science (FCS), NY, USA},
  address    = {New York, USA}
}
%0 Journal Article
%A Jai Vardhan Singh
%A Girijesh Prasad
%T Enhancing an Eye-Tracker based Human-Computer Interface with Multi-modal Accessibility Applied for Text Entry
%J International Journal of Computer Applications
%@ 0975-8887
%V 130
%N 16
%P 16-22
%D 2015
%I Foundation of Computer Science (FCS), NY, USA
Abstract

In the natural course of things, human beings make use of multiple sensory modalities to communicate effectively and to carry out day-to-day tasks efficiently. During verbal conversation, for instance, we use voice, eyes, and various body gestures. Likewise, effective human-computer interaction involves hands, eyes, and voice, when available. Combining multiple sensory modalities can therefore make the whole interaction more natural and deliver enhanced performance, even for disabled users. Towards this end, we have developed a multi-modal human-computer interface (HCI) by combining an eye-tracker with a soft-switch, the latter representing a second input modality. This multi-modal HCI is applied to text entry using a virtual keyboard designed in-house to facilitate enhanced performance. Our experimental results demonstrate that multi-modal text entry through the virtual keyboard is more efficient and less strenuous than a single-modality system, and that it solves the Midas-touch problem inherent in eye-tracker based HCI systems in which dwell time alone is used to select a character.
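The selection logic the abstract contrasts, dwell-time-only eye typing versus gaze pointing confirmed by a soft-switch, can be sketched in a few lines of Python. This is a minimal illustration rather than the authors' implementation: `GazeSample`, `switch_pressed`, and the 1-second dwell threshold are assumptions made for the example.

```python
from dataclasses import dataclass
from typing import Callable, Iterable, Iterator, Optional

DWELL_TIME_S = 1.0  # illustrative dwell threshold; the paper's value may differ

@dataclass
class GazeSample:
    key: Optional[str]  # virtual-keyboard key under the gaze point, if any
    t: float            # timestamp in seconds

def dwell_select(samples: Iterable[GazeSample]) -> Iterator[str]:
    """Dwell-only selection: a key is typed once gaze rests on it long
    enough. Every sufficiently long look types a character, which is
    the Midas-touch problem."""
    current, start = None, 0.0
    for s in samples:
        if s.key != current:
            current, start = s.key, s.t    # gaze moved to a new key
        elif current is not None and s.t - start >= DWELL_TIME_S:
            yield current                  # dwell alone fires the selection
            start = s.t                    # re-arm for a repeated character

def gaze_plus_switch_select(samples: Iterable[GazeSample],
                            switch_pressed: Callable[[float], bool]) -> Iterator[str]:
    """Gaze-plus-switch selection: gaze only points, the soft-switch
    confirms. Looking without pressing types nothing."""
    for s in samples:
        if s.key is not None and switch_pressed(s.t):
            yield s.key
```

The design point is that in the gaze-plus-switch path gaze never triggers a selection by itself; a character is entered only when the switch confirms it, which is how a second modality sidesteps the Midas-touch problem.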

References
  1. Johansen A S and Hansen J P 2006 Augmentative and alternative communication: the future of text on the move Universal Access in the Information Society 5 125-49
  2. Zhai S, Morimoto C and Ihde S 1999 Manual and gaze input cascaded (MAGIC) pointing Proc. of the SIGCHI Conf. on Human Factors in Computing Systems CHI '99 (Pittsburgh) (New York: ACM) pp 246-53
  3. Jacob R J K 1991 The use of eye movements in human-computer interaction techniques: what you look at is what you get ACM T. Inform. Syst. 9 152-69
  4. Hansen D W and Ji Q 2010 In the eye of the beholder: A survey of models for eyes and gaze IEEE Trans. Pattern Anal. Mach. Intell. 32 478-500
  5. Sibert L E and Jacob R J K 2000 Evaluation of eye gaze interaction Proc. of the SIGCHI Conf. on Human Factors in Computing Systems CHI ’00 (Hague) (New York: ACM) pp 281-8
  6. Murata A 2006 Eye‐gaze input versus mouse: Cursor control as a function of age International Journal of Human‐Computer Interaction 21 1-14
  7. Duchowski A T 2002 A breadth-first survey of eye-tracking applications Behavior Research Methods, Instruments, & Computers 34 455-70
  8. Majaranta P and Räihä K -J 2002 Twenty years of eye typing: systems and design issues Proc. of the 2002 Symp. on Eye Tracking Research & Applications ETRA'02 (New Orleans) (New York: ACM) pp 15-22
  9. Tuisku O, Majaranta P, Isokoski P and Räihä K -J 2008 Now Dasher! Dash away!: longitudinal study of fast text entry by Eye Gaze Proc. of the 2008 Symp. on Eye Tracking Research & Applications ETRA '08 (Savannah) (New York: ACM) pp 19-26
  10. Ward D J and MacKay D J C 2002 Fast hands-free writing by gaze direction Nature 418 838 (arXiv preprint cs/0204030)
  11. Kumar M, Paepcke A and Winograd T 2007 EyePoint: practical pointing and selection using gaze and keyboard Proc. of the SIGCHI Conf. on Human Factors in Computing Systems CHI '07 (San Jose) (New York: ACM) pp 421-30
  12. Istance H, Bates R, Hyrskykari A and Vickers S 2008 Snap clutch, a moded approach to solving the Midas touch problem Proc. of the 2008 Symp. on Eye Tracking Research & Applications ETRA ’08 (Savannah) (New York: ACM) pp 221-8
  13. Chapman J E 1991 Use of an eye-operated computer system in locked-in syndrome Proc. of the Sixth Annual International Conference on Technology and Persons with Disabilities CSUN '91 (Los Angeles)
  14. Majaranta P, MacKenzie I S, Aula A and Räihä K -J 2006 Effects of feedback and dwell time on eye typing speed and accuracy Universal Access in the Information Society 5 199-208
  15. Bee N and André E 2008 Writing with your eye: A dwell time free writing system adapted to the nature of human eye gaze Perception in Multimodal Dialogue Systems vol 5078, ed E André et al (Berlin Heidelberg: Springer) pp 111-22
  16. Shein G F 1997 Towards task transparency in alternative computer access: selection of text through switch-based scanning PhD diss., University of Toronto
  17. Majaranta P and Räihä K -J 2007 Text entry by gaze: Utilizing eye-tracking Text Entry Systems: Mobility, Accessibility, Universality (San Francisco: Morgan Kaufmann Publishers Inc) pp 175-87
  18. Hansen J P, Tørning K, Johansen A S, Itoh K and Aoki H 2004 Gaze typing compared with input by head and hand Proc. of the 2004 Symp. on Eye Tracking Research & Applications ETRA '04 (New York: ACM) pp 131-8
  19. Pannasch S, Helmert J R, Malischke S, Storch A and Velichkovsky B M 2008 Eye typing in application: A comparison of two systems with ALS patients Journal of Eye Movement Research 2 1-8
  20. Bolt R A 1980 Put-that-there: Voice and gesture at the graphics interface Proc. of the 7th Annual Conf. on Computer Graphics and Interactive Techniques SIGGRAPH '80 (New York: ACM)
  21. Špakov O and Miniotas D 2004 On-line adjustment of dwell time for target selection by gaze Proc. of the Third Nordic Conf. on Human-Computer Interaction NordiCHI '04 (Tampere) (New York: ACM) pp 203-6
  22. Majaranta P, Ahola U -K and Špakov O 2009 Fast gaze typing with an adjustable dwell time Proc. of the SIGCHI Conf. on Human Factors in Computing Systems CHI '09 (Boston) (New York: ACM) pp 357-60
  23. Sharma R, Pavlovic V I and Huang T S 1998 Toward multimodal human-computer interface Proc. IEEE 86 853-69
  24. Oviatt S 2003 Advances in robust multimodal interface design IEEE Comput. Graph. Appl. 23 62-8
  25. Kaur M, Tremaine M, Huang N, Wilder J, Gacovski Z, Flippo F and Mantravadi C S 2003 Where is it? Event synchronization in gaze-speech input systems Proc. of the 5th Int. Conf. on Multimodal Interfaces ICMI ’03 (Vancouver) (New York: ACM) pp 151-8
  26. Oviatt S, DeAngeli A and Kuhn K 1997 Integration and synchronization of input modes during multimodal human-computer interaction Proc. of the ACM SIGCHI Conf. on Human Factors in Computing Systems CHI '97 (Atlanta) (New York: ACM) pp 415-22
  27. Maglio P P, Matlock T, Campbell C S, Zhai S and Smith B A 2000 Gaze and speech in attentive user interfaces In Advances in Multimodal Interfaces—ICMI 2000 vol 1948, ed T Tan et al (Berlin Heidelberg: Springer) pp 1-7
  28. Prabhu V and Prasad G 2011 Designing a virtual keyboard with multi-modal access for people with disabilities 2011 World Congress on Information and Communication Technologies (WICT) (Mumbai) (USA: IEEE) pp 1133-8
  29. MacKenzie I S and Zhang X 2008 Eye typing using word and letter prediction and a fixation algorithm  Proc. of the 2008 Symp. on Eye Tracking Research & Applications ETRA ’08 (Savannah) (New York: ACM) pp 55-8
  30. Soukoreff R W and MacKenzie I S 2001 Measuring errors in text entry tasks: an application of the Levenshtein string distance statistic CHI '01 Extended Abstracts on Human Factors in Computing Systems (Seattle) (New York: ACM) pp 319-20
  31. MacKenzie I S 2002 KSPC (keystrokes per character) as a characteristic of text entry techniques Human Computer Interaction with Mobile Devices vol 2411, ed F Paternò (Berlin Heidelberg: Springer) pp 195-210
  32. Soukoreff R W and MacKenzie I S 2003 Metrics for text entry research: an evaluation of MSD and KSPC, and a new unified error metric Proc. of the SIGCHI Conf. on Human Factors in Computing Systems CHI ’03 (Florida) (New York: ACM) pp 113-20
  33. Ware C and Mikaelian H H 1987 An evaluation of an eye tracker as a device for computer input Proc. of the SIGCHI/GI Conf. on Human Factors in Computing Systems and Graphics Interface CHI '87 (Toronto) (New York: ACM) pp 183-8
  34. Porta M and Turina M 2008 Eye-S: a full-screen input modality for pure eye-based communication Proc. of the 2008 Symp. on Eye Tracking Research & Applications ETRA ’08 (Savannah) (New York: ACM) pp 27-34
  35. Urbina M H and Huckauf A 2010 Alternatives to single character entry and dwell time selection on eye typing Proc. of the 2010 Symp. on Eye Tracking Research & Applications ETRA ’10 (Austin) (New York: ACM) pp 315-22
  36. Helmert J R, Pannasch S and Velichkovsky B M 2008 Influences of dwell time and cursor control on the performance in gaze driven typing Journal of Eye Movement Research 2 1-8
  37. Zhao X A, Guestrin E D, Sayenko D, Simpson T, Gauthier M and Popovic M R 2012 Typing with eye-gaze and tooth-clicks Proc. of the 2012 Symp. on Eye Tracking Research & Applications ETRA ’12 (Santa Barbara) (New York: ACM) pp 341-4
  38. The ViewPoint EyeTracker®: http://www.arringtonresearch.com/
  39. AbleData: http://www.abledata.com/abledata.cfm?pageid=113582&orgid=109986, Accessed on Feb 11, 2014
  40. Soft Switch: http://www.abledata.com/abledata.cfm?pageid=113583&top=0&productid=96746&trail=0, Accessed on Feb 11, 2014
  41. USB Switch Interface: http://www.abledata.com/abledata.cfm?pageid=19327&top=15584&ksectionid=0&productid=75667&trail=22,11114,11131&discontinued=0, Accessed on Feb 11, 2014
  42. Abbott W W and Faisal A A 2012 Ultra-low-cost 3D gaze estimation: an intuitive high information throughput compliment to direct brain–machine interfaces J. Neural Eng. 9 046016
  43. Morimoto C H and Amir A 2010 Context switching for fast key selection in text entry applications Proc. of the 2010 Symp. on Eye Tracking Research & Applications ETRA ’10 (Austin) (New York: ACM) pp 271-4
  44. Vertanen K and MacKay D J C 2010 Speech dasher: fast writing using speech and gaze Proc. of the SIGCHI Conf. on Human Factors in Computing Systems CHI ’10 (Atlanta) (New York: ACM) pp 595-8
  45. Beelders T R and Blignaut P J 2012 Measuring the performance of gaze and speech for text input Proc. of the 2012 Symp. on Eye Tracking Research & Applications ETRA ’12 (Santa Barbara) (New York: ACM) pp 337-40
  46. Hoste L, Dumas B and Signer B 2012 SpeeG: a multimodal speech-and gesture-based text input solution Proc. of the Int. Working Conf. on Advanced Visual Interfaces AVI ’12 (Capri Island, Italy) (New York: ACM) pp 156-63
  47. Kumar A, Paek T and Lee B 2012 Voice typing: a new speech interaction model for dictation on touchscreen devices Proc. of the SIGCHI Conf. on Human Factors in Computing Systems CHI ’12 (Austin) (New York: ACM) pp 2277-86
Index Terms

Computer Science
Information Sciences

Keywords

Multi-modal, eye-tracker, HCI, eye typing, eye-gaze.