Fast Human-Computer Interaction by Combining Gaze Pointing and Face Gestures


Publisher
Association for Computing Machinery
Copyright
Copyright © 2017 ACM
ISSN
1936-7228
eISSN
1936-7236
DOI
10.1145/3075301

Abstract

In this work, we show how our open-source accessibility software, the FaceSwitch, can help motor-impaired subjects interact with a computer efficiently and hands-free. The FaceSwitch enhances gaze interaction with video-based face-gesture interaction. The resulting multimodal system allows interaction with a user interface by means of gaze pointing for target selection and facial gestures for target-specific action commands. The FaceSwitch maps facial gestures to specific mouse or keyboard events such as left mouse click, right mouse click, or page scroll down. Hence, facial gestures serve the purpose of mechanical switches. With this multimodal interaction paradigm, the user gazes at the object in the user interface with which they want to interact and then triggers a target-specific action by performing a face gesture. Through a rigorous user study, we have obtained quantitative evidence suggesting that our proposed interaction paradigm outperforms traditional accessibility options, such as gaze-only interaction or gaze combined with a single mechanical switch, while coming close in speed and accuracy to traditional mouse-based interaction. We make the FaceSwitch software freely available to the community so the output of our research can help the target audience.
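The gesture-to-event mapping described above can be sketched as a simple lookup table that routes a detected facial gesture to an input event at the currently gazed-at target. The gesture names and event identifiers below are illustrative assumptions, not FaceSwitch's actual configuration or API.

```python
# Hypothetical sketch of the multimodal paradigm: gaze supplies the
# target, a recognized facial gesture supplies the action. Gesture and
# event names are assumed for illustration only.
GESTURE_EVENT_MAP = {
    "open_mouth": "left_click",
    "raise_eyebrows": "right_click",
    "smile": "scroll_down",
}

def dispatch(gesture: str, gaze_target: str) -> str:
    """Return the input event fired at the gazed-at target, or a no-op."""
    event = GESTURE_EVENT_MAP.get(gesture)
    if event is None:
        return f"no action at {gaze_target}"
    return f"{event} at {gaze_target}"

# Example: gazing at a button and opening the mouth acts as a left click.
print(dispatch("open_mouth", "Save button"))  # left_click at Save button
```

Keeping the mapping in a plain table mirrors the switch metaphor in the abstract: each gesture stands in for one mechanical switch and can be rebound without changing the dispatch logic.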

Journal

ACM Transactions on Accessible Computing (TACCESS), Association for Computing Machinery

Published: Aug 11, 2017

Keywords: Gaze interaction
