Research Article

Wheelchair Controlling by eye movements using EOG based Human Machine Interface and Artificial Neural Network

by Aminollah Golrou, Nasrin Rafiei, Mahdieh Sabouri
International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 184 - Number 38
Year of Publication: 2022
DOI: 10.5120/ijca2022922465

Aminollah Golrou, Nasrin Rafiei, Mahdieh Sabouri. Wheelchair Controlling by eye movements using EOG based Human Machine Interface and Artificial Neural Network. International Journal of Computer Applications 184, 38 (Dec 2022), 12-18. DOI=10.5120/ijca2022922465

@article{ 10.5120/ijca2022922465,
author = { Aminollah Golrou, Nasrin Rafiei, Mahdieh Sabouri },
title = { Wheelchair Controlling by eye movements using EOG based Human Machine Interface and Artificial Neural Network },
journal = { International Journal of Computer Applications },
issue_date = { Dec 2022 },
volume = { 184 },
number = { 38 },
month = { Dec },
year = { 2022 },
issn = { 0975-8887 },
pages = { 12-18 },
numpages = { 7 },
url = { https://ijcaonline.org/archives/volume184/number38/32562-2022922465/ },
doi = { 10.5120/ijca2022922465 },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Journal Article
%A Aminollah Golrou
%A Nasrin Rafiei
%A Mahdieh Sabouri
%T Wheelchair Controlling by eye movements using EOG based Human Machine Interface and Artificial Neural Network
%J International Journal of Computer Applications
%@ 0975-8887
%V 184
%N 38
%P 12-18
%D 2022
%I Foundation of Computer Science (FCS), NY, USA
Abstract

The use of vital signals as an interface between humans and computers has recently attracted a great deal of attention. The electro-oculogram (EOG) signal, which arises from the standing potential of the eye, is one of these signals. EOG-based Human-Machine Interfaces (HMIs) are widely investigated and considered a promising interface option for disabled people. In this study, artificial neural networks were utilized to detect eye movements from the EOG signal. Owing to their ability to learn nonlinear dynamics and their universal approximation property, neural networks can detect and classify biological signals with nonlinear dynamics, including EOG signals. Two fundamentally distinct networks, MLP and ART, were used to detect sequential and random eye movements for controlling a wheelchair. The results indicate that the MLP network detects consecutive eye movements with an accuracy of over 90%, although its accuracy on random movements is relatively poor. For random eye movements, the best results are obtained with the ART2AE network, which achieves a classification accuracy of over 70%.
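The MLP approach described in the abstract can be illustrated with a minimal sketch: a small feed-forward network trained to map EOG-derived features (here, synthetic horizontal and vertical amplitudes — illustrative assumptions, not the paper's recordings or architecture) to four gaze directions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic EOG features: [horizontal amplitude, vertical amplitude] per
# gaze direction. These cluster centers are illustrative assumptions only.
centers = np.array([[-1.0,  0.0],   # left
                    [ 1.0,  0.0],   # right
                    [ 0.0,  1.0],   # up
                    [ 0.0, -1.0]])  # down
n_per_class = 200
X = np.vstack([c + 0.25 * rng.standard_normal((n_per_class, 2)) for c in centers])
y = np.repeat(np.arange(4), n_per_class)
T = np.eye(4)[y]  # one-hot targets

# Tiny MLP: 2 -> 8 (tanh) -> 4 (softmax), full-batch gradient descent.
W1 = 0.5 * rng.standard_normal((2, 8)); b1 = np.zeros(8)
W2 = 0.5 * rng.standard_normal((8, 4)); b2 = np.zeros(4)
lr = 0.5
for _ in range(500):
    H = np.tanh(X @ W1 + b1)                     # hidden activations
    Z = H @ W2 + b2
    P = np.exp(Z - Z.max(axis=1, keepdims=True))
    P /= P.sum(axis=1, keepdims=True)            # softmax probabilities
    G2 = (P - T) / len(X)                        # cross-entropy gradient
    G1 = (G2 @ W2.T) * (1 - H ** 2)              # backprop through tanh
    W2 -= lr * H.T @ G2; b2 -= lr * G2.sum(axis=0)
    W1 -= lr * X.T @ G1; b1 -= lr * G1.sum(axis=0)

# Final forward pass with the updated weights.
H = np.tanh(X @ W1 + b1)
pred = (H @ W2 + b2).argmax(axis=1)
accuracy = (pred == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

On well-separated clusters like these the sketch reaches high training accuracy; the paper's reported >90% figure for consecutive movements comes from real EOG data, which this toy example does not reproduce.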

References
  1. Lim, Y., Gardi, A., Ezer, N., Kistan, T., & Sabatini, R. (2018, June). Eye-tracking sensors for adaptive aerospace human-machine interfaces and interactions. In 2018 5th IEEE International Workshop on Metrology for AeroSpace (MetroAeroSpace) (pp. 311-316). IEEE.
  2. Hossain, Z., Shuvo, M. M. H., & Sarker, P. (2017, September). Hardware and software implementation of real time electrooculogram (EOG) acquisition system to control computer cursor with eyeball movement. In 2017 4th International Conference on Advances in Electrical Engineering (ICAEE) (pp. 132-137). IEEE.
  3. Mifsud, M., Camilleri, T. A., & Camilleri, K. P. (2022). Dwell-Free Typing Using an EOG Based Virtual Keyboard. In International Conference on Human-Computer Interaction (pp. 54-62). Springer, Cham.
  4. Jameel, Huda Farooq, Sadik Kamel Gharghan, and Saleem Latteef Mohammed. "Wheelchair Control System for the Disabled Based on EMOTIV Sensor Gyroscope." Microprocessors and Microsystems (2022): 104686.
  5. Gesmallah, Abubakr Elsadig, Amna Yousof Mohamed, and Eltayeb Mohamed Salih Eltayeb. Design of a voice-controlled wheelchair for disabled people simulated using mobile Bluetooth connection. Diss. 2022.
  6. Chakraborty, Partha, Mofizul Alam Mozumder, and Md Saif Hasan. "Eye-Gaze-Controlled Wheelchair System with Virtual Keyboard for Disabled Person Using Raspberry Pi." Machine Intelligence and Data Science Applications. Springer, Singapore, 2022. 49-61.
  7. Barea, R., Boquete, L., Mazo, M., & López, E. (2002). Wheelchair guidance strategies using EOG. Journal of intelligent and robotic systems, 34(3), 279-299.
  8. Khademi, M., Mousavi Hondori, H., McKenzie, A., Dodakian, L., Lopes, C. V., & Cramer, S. C. (2014). Free-hand interaction with leap motion controller for stroke rehabilitation. In CHI'14 Extended Abstracts on Human Factors in Computing Systems (pp. 1663-1668).
  9. Qi, J., Jiang, G., Li, G., Sun, Y., & Tao, B. (2019). Intelligent human-computer interaction based on surface EMG gesture recognition. IEEE Access, 7, 61378-61387.
  10. Kumar, N., & Kumar, J. (2016). Measurement of cognitive load in HCI systems using EEG power spectrum: an experimental study. Procedia Computer Science, 84, 70-78.
  11. Girouard, A., Hirshfield, L. M., Solovey, E., & Jacob, R. J. (2008). Using functional Near-Infrared Spectroscopy in HCI: Toward evaluation methods and adaptive interfaces. In Proc. chi 2008 workshop on brain-computer interfaces for hci and games.
  12. Lee, K. R., Chang, W. D., Kim, S., & Im, C. H. (2016). Real-time "eye-writing" recognition using electrooculogram. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 25(1), 37-48.
  13. Bulling, A., Ward, J. A., Gellersen, H., & Tröster, G. (2009, September). Eye movement analysis for activity recognition. In Proceedings of the 11th International Conference on Ubiquitous Computing (pp. 41-50).
  14. Bulling, A., Ward, J. A., Gellersen, H., & Tröster, G. (2010). Eye movement analysis for activity recognition using electrooculography. IEEE Transactions on Pattern Analysis and Machine Intelligence, 33(4), 741-753.
  15. Ianez, E., Azorin, J. M., & Perez-Vidal, C. (2013). Using eye movement to control a computer: A design for a lightweight electro-oculogram electrode array and computer interface. PloS one, 8(7), e67099.
  16. Kim, D. Y., Han, C. H., & Im, C. H. (2018). Development of an electrooculogram-based human-computer interface using involuntary eye movement by spatially rotating sound for communication of locked-in patients. Scientific Reports, 8(1), 1-10.
  17. Barea, R., Boquete, L., Mazo, M., & López, E. (2002). System for assisted mobility using eye movements based on electrooculography. IEEE transactions on neural systems and rehabilitation engineering, 10(4), 209-218.
Index Terms

Computer Science
Information Sciences

Keywords

EOG, Human-Machine Interfaces (HMIs), Eye movements, Tracking, MLP, ART