Gauntlet-X1: Smart Glove System for American Sign Language translation using Hand Activity Recognition

University essay from Uppsala universitet/Institutionen för informationsteknologi

Author: Asif Mohamed; Paul Sujeet; Vishnu Ullas; [2020]


Abstract: The most common forms of Human Computer Interaction (HCI) devices these days, like the keyboard, mouse and touch interfaces, are limited to working on a two-dimensional (2-D) surface, and thus do not provide complete freedom of accessibility using our hands. With the vast number of gestures a hand can perform, including the different combinations of motion of the fingers, wrist and elbow, we can make accessibility and interaction with the digital environment much simpler, without restriction to a physical surface. Fortunately, this is possible thanks to advances in Microelectromechanical systems (MEMS) sensor manufacturing, which have reduced the size of a sensor to that of a fingernail.

In this thesis we document the design and development of a smart glove system comprising Inertial Measurement Unit (IMU) sensors that recognize hand activity/gestures using combinations of neural network and deep learning techniques such as Long Short-Term Memory (LSTM) and Convolutional Neural Network (CNN). This peripheral device is named the Gauntlet-X1, where X1 denotes the current prototype version of the device. The system captures IMU data and interfaces with the host server. To demonstrate this prototype as a proof of concept, we integrate it with Android mobile applications based on 3-D interactivity, such as American Sign Language (ASL) and Augmented Reality (AR)/Virtual Reality (VR) applications, and it can be extended to further the use of HCI technology.
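As a rough illustration of the kind of data pipeline the abstract describes (the function name, sampling rate, and window sizes below are hypothetical, not taken from the thesis), a stream of 6-axis IMU readings is typically segmented into fixed-length windows before being fed to a CNN/LSTM gesture classifier:

```python
import numpy as np

def window_imu_stream(samples, window_len=50, stride=25):
    """Segment a (T, 6) stream of IMU readings (3-axis accelerometer +
    3-axis gyroscope) into overlapping windows of shape (window_len, 6),
    a common input format for CNN/LSTM sequence classifiers."""
    windows = []
    for start in range(0, len(samples) - window_len + 1, stride):
        windows.append(samples[start:start + window_len])
    return np.stack(windows) if windows else np.empty((0, window_len, 6))

# Example: 2 seconds of data at a hypothetical 100 Hz -> 200 samples, 6 channels
stream = np.zeros((200, 6))
batch = window_imu_stream(stream)
print(batch.shape)  # (7, 50, 6): 7 overlapping windows of 50 samples each
```

Each window in the resulting batch is one candidate gesture segment; overlapping windows (stride smaller than window length) help the classifier catch gestures that straddle window boundaries.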
