Combining Eye Tracking and Gestures to Interact with a Computer System
Abstract: Eye tracking and gestures are relatively new input methods that are changing the way humans interact with computers. Gestures can be used for games or for controlling a computer through an interface. Eye tracking is another way of interacting with computers, often combined with other inputs such as a mouse or touch pad. Gestures and eye tracking have each been used in commercially available products, but they are seldom combined into a multimodal interaction. This thesis presents a prototype that combines eye tracking with gestures to interact with a computer. To accomplish this, the report investigates different methods of recognizing hand gestures. The aim is to combine the technologies in such a way that the gestures can remain simple, while the location of the user's gaze determines what each gesture does. The report concludes by presenting a final prototype in which gestures are combined with eye tracking to interact with a computer; it uses an IR camera together with an eye tracker. The prototype is evaluated with regard to learnability, usefulness, and intuitiveness. The evaluation shows that usefulness is low, but learnability and intuitiveness are quite high.