Mobile Device Gaze Estimation with Deep Learning: Using Siamese Neural Networks

University essay from KTH/School of Electrical Engineering and Computer Science (EECS)

Abstract: Gaze tracking has already proven to be a popular technology for desktop devices. When it comes to gaze tracking for mobile devices, however, there is still much progress to be made: no high-accuracy gaze tracking is yet available for mobile devices that works in an unconstrained setting. This work contributes to the area of appearance-based unconstrained gaze estimation. Artificial neural networks are trained on GazeCapture, a publicly available dataset for mobile gaze estimation containing over 2 million face images and corresponding gaze labels. In this work, Siamese neural networks are trained to learn linear distances between face images corresponding to different gaze points. During inference, calibration points are then used to estimate the gaze point. This approach is shown to be an effective way of utilizing calibration points to improve gaze estimation accuracy.
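To make the idea concrete, the sketch below shows one possible way to set up such a Siamese model and calibration-based inference. This is not the thesis's actual architecture: it assumes PyTorch, an illustrative shared CNN branch, 64x64 face crops, and the interpretation that the network predicts the 2-D gaze offset between two face images, which is then added to the known gaze points of calibration images.

```python
# Minimal sketch (illustrative, not the thesis's implementation) of a Siamese
# gaze-difference network plus calibration-based inference. All layer sizes,
# the image resolution, and the averaging scheme are assumptions.
import torch
import torch.nn as nn


class SiameseGazeNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Shared convolutional branch applied to both face images.
        self.branch = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Head maps the difference of the two embeddings to a 2-D gaze offset.
        self.head = nn.Sequential(
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, 2),
        )

    def forward(self, img_a, img_b):
        emb_a = self.branch(img_a)
        emb_b = self.branch(img_b)
        # Predicted (gaze_a - gaze_b); trainable with an MSE loss against the
        # true difference of the two gaze labels.
        return self.head(emb_a - emb_b)


def estimate_gaze(model, query_img, calib_imgs, calib_points):
    """Estimate the gaze point of query_img by predicting its offset to each
    calibration image and averaging the resulting gaze estimates."""
    model.eval()
    with torch.no_grad():
        estimates = []
        for img, point in zip(calib_imgs, calib_points):
            offset = model(query_img.unsqueeze(0), img.unsqueeze(0))[0]
            estimates.append(point + offset)
        return torch.stack(estimates).mean(dim=0)


if __name__ == "__main__":
    model = SiameseGazeNet()
    # Dummy tensors standing in for GazeCapture face crops and screen coordinates.
    query = torch.randn(3, 64, 64)
    calib_imgs = [torch.randn(3, 64, 64) for _ in range(5)]
    calib_points = [torch.randn(2) for _ in range(5)]
    print(estimate_gaze(model, query, calib_imgs, calib_points))
```

Averaging over several calibration points is only one plausible way to combine them; the key point from the abstract is that known gaze labels at calibration time can be exploited at inference to improve accuracy.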
