Combining Eye- and Head-Tracking Signals for Improved Event Detection
Abstract: Analysing eye movements recorded with mobile eye-tracking devices is difficult because the eye-tracking signals are severely affected by simultaneous head and body movements. The automatic analysis methods developed for static eye-tracking systems do not take this into account and are therefore not suitable for data that also contain head and body movements. As a result, data recorded with mobile eye trackers are often analysed manually. The goal of the present master's thesis is to develop a method that can robustly detect the three most common types of eye movements from an eye-tracking signal recorded with mobile eye-tracking glasses. Furthermore, the method should compensate for head movements, which are simultaneously recorded with an inertial measurement unit. A model for eye-in-space motion estimation is proposed that combines eye-tracking and head-tracking signals. In addition, a new, enhanced event-detection algorithm is developed for the classification of saccades, fixations, and smooth-pursuit movements. To test the method, a pilot study is conducted. Moreover, the classification performance of the algorithm is evaluated by comparing the detected events to manual annotations and to the output of two existing algorithms. The results show that, by compensating for head movements, the proposed algorithm can accurately perform ternary classification of eye movements based on mobile eye-tracking data. With sensitivities and specificities above 95% on both a development and a validation database, the proposed algorithm exhibits considerably better detection performance than the two existing algorithms used for comparison.
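The core idea in the abstract, i.e. compensating eye-in-head velocity with head velocity from an IMU before classifying events, can be illustrated with a minimal sketch. This is not the thesis's actual algorithm: the function name, the simple additive velocity model, and the velocity thresholds (assumed here in deg/s) are all hypothetical, chosen only to show the ternary saccade/fixation/smooth-pursuit decision on head-compensated data.

```python
import numpy as np

def classify_eye_movements(eye_in_head_vel, head_vel,
                           saccade_thresh=100.0, pursuit_thresh=8.0):
    """Toy ternary event classification on head-compensated gaze velocity.

    eye_in_head_vel : angular eye velocity from the eye tracker (deg/s)
    head_vel        : angular head velocity from the IMU gyroscope (deg/s)
    Thresholds are illustrative assumptions, not values from the thesis.
    """
    # Eye-in-space velocity: eye-in-head motion plus head rotation.
    eye_in_space = np.asarray(eye_in_head_vel) + np.asarray(head_vel)
    speed = np.abs(eye_in_space)
    # Simple velocity-threshold rule: fast -> saccade,
    # moderate -> smooth pursuit, slow -> fixation.
    return np.where(speed > saccade_thresh, "saccade",
                    np.where(speed > pursuit_thresh, "pursuit", "fixation"))
```

Note that with head compensation, an eye that counter-rotates against the head (as in the vestibulo-ocular reflex) yields near-zero eye-in-space speed and is correctly labelled a fixation, which a head-unaware threshold on the raw eye-tracking signal would misclassify.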