Symmetric Evaluation of Multimodal Human–Robot Interaction with Gaze and Standard Control
Open Access
- 1 December 2018
- Vol. 10 (12), 680
- https://doi.org/10.3390/sym10120680
Abstract
Control of robot arms is often required in engineering and can be performed using different methods. This study examined and symmetrically compared the use of a hand controller, an eye gaze tracker, and a combination of the two in a multimodal setup for control of a robot arm. Tasks of different complexities were defined, and twenty participants completed an experiment using these interaction modalities to solve them. Specifically, there were three tasks: the first was to navigate a chess piece from one square to another pre-specified square; the second was the same as the first but required more moves to complete; and the third was to move multiple pieces to reach a pre-defined arrangement of the pieces. While gaze control has the potential to be more intuitive than a hand controller, it suffers from limitations in spatial accuracy and target selection. The multimodal setup aimed to mitigate these weaknesses of the eye gaze tracker, creating a superior system without relying solely on the controller. The experiment showed that the multimodal setup improved performance over the eye gaze tracker alone and was competitive with the controller-only setup, although it did not outperform it.
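The abstract's core idea — using the controller to compensate for the gaze tracker's limited spatial accuracy — can be illustrated with a minimal sketch. The function names, square size, and confirm-button mechanism below are assumptions for illustration only; the paper does not specify its implementation. The sketch snaps a noisy gaze point to the nearest chessboard square, but commits the selection only when the controller's confirm input is pressed:

```python
# Hypothetical sketch of multimodal target selection: gaze provides coarse
# pointing, the hand controller confirms the selection. All names and the
# display scale are illustrative, not taken from the paper.

SQUARE_SIZE = 100  # assumed pixels per board square on the display


def snap_gaze_to_square(gaze_x, gaze_y):
    """Snap a noisy gaze point to the (row, col) of the nearest board square."""
    col = int(gaze_x // SQUARE_SIZE)
    row = int(gaze_y // SQUARE_SIZE)
    return row, col


def select_target(gaze_x, gaze_y, confirm_pressed):
    """Return the gazed-at square only when the controller confirms it.

    Gaze alone is too imprecise to commit a move, so without an explicit
    confirm press no target is selected (avoiding the "Midas touch" problem
    of unintended gaze selections).
    """
    if not confirm_pressed:
        return None
    return snap_gaze_to_square(gaze_x, gaze_y)
```

For example, a gaze point at (130, 255) with the confirm button held selects square (2, 1), while the same gaze point without confirmation selects nothing.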