Deep Learning-Based Eye Gaze Controlled Robotic Car

Abstract
In recent years, eye gaze tracking (EGT) has emerged as an attractive alternative to conventional communication modes. Gaze estimation can be used effectively in human-computer interaction, assistive devices for motor-disabled persons, autonomous robot control systems, safe car driving, disease diagnosis, and even human sentiment assessment. Implementation in any of these areas, however, depends largely on the efficiency of the detection algorithm, along with the usability and robustness of the detection process. In this context, we propose a Convolutional Neural Network (CNN) architecture that estimates eye gaze direction from detected eyes and outperforms all other state-of-the-art results on the Eye-Chimera dataset. The overall accuracies are 90.21% and 99.19% on the Eye-Chimera and HPEG datasets, respectively. This paper also introduces a new dataset, EGDC, on which the proposed algorithm achieves 86.93% accuracy. We have developed a real-time eye gaze controlled robotic car as a prototype for possible implementations of our algorithm.
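As a rough illustration of the kind of model the abstract describes, the sketch below shows a small CNN classifier that maps a cropped eye image to discrete gaze-direction classes. It is not the authors' published architecture: the input size, layer widths, and the seven-class output (assuming one class per Eye-Chimera gaze direction) are illustrative assumptions only.

# Minimal sketch (not the authors' exact architecture): a small CNN that
# classifies a cropped eye image into discrete gaze-direction classes.
# Input shape and layer sizes are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 7  # assumption: one class per gaze direction, as in Eye-Chimera

def build_gaze_cnn(input_shape=(36, 60, 1), num_classes=NUM_CLASSES):
    model = models.Sequential([
        layers.Conv2D(32, (3, 3), activation="relu", input_shape=input_shape),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),  # gaze-direction probabilities
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    model = build_gaze_cnn()
    model.summary()

In a real-time pipeline such as the robotic-car prototype, the predicted class would be mapped to a steering command; the details of that mapping are outside the scope of this abstract.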