Application Informed Motion Signal Processing For Finger Motion Tracking Using Wearable Sensors
Yilin Liu, Fengyang Jiang, Mahanth Gowda
SPS
Finger motion tracking has applications in user interfaces, sports analytics, medical rehabilitation, and sign language translation. This paper presents FinGTrAC, a system that shows the feasibility of fine-grained finger gesture tracking using a low-intrusion wearable sensor platform (a smart ring worn on the index finger and a smartwatch worn on the wrist). Such sparse sensors are convenient to wear but cannot track all fingers and hence provide under-constrained information. However, application-specific context can fill the gaps in sparse sensing and improve the accuracy of gesture classification. This paper demonstrates the feasibility of exploiting such context in an application of American Sign Language (ASL) translation. Non-trivial challenges arise from noisy sensor data, variations in gesture performance across users, and the inability to capture data from all fingers. FinGTrAC exploits a number of opportunities in data preprocessing, filtering, pattern matching, and the context of an ASL sentence to systematically fuse the sensory information into a Bayesian filtering framework. This culminates in a Hidden Markov Model, over which a Viterbi decoding scheme detects finger gestures and the corresponding ASL sentences. Evaluation on 10 users shows a detection accuracy of 94.2% for the 100 most frequently used ASL finger gestures over sentences.
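The core inference step described in the abstract, decoding the most likely gesture sequence from noisy sensor observations with a Hidden Markov Model, can be illustrated with a generic Viterbi sketch. This is not the FinGTrAC implementation; the states, probabilities, and function name below are hypothetical placeholders, and the paper's actual model fuses richer sensor and sentence-context information.

```python
import numpy as np

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Decode the most likely hidden-state sequence (e.g. gestures)
    given a sequence of discretized observations (e.g. sensor symbols).

    obs     : list of observation indices, length T
    states  : list of state labels, length N
    start_p : (N,) initial state probabilities
    trans_p : (N, N) transition probabilities, trans_p[i, j] = P(j | i)
    emit_p  : (N, M) emission probabilities, emit_p[s, o] = P(o | s)
    """
    T, n = len(obs), len(states)
    # Work in log space to avoid numerical underflow on long sequences.
    logp = np.full((T, n), -np.inf)
    back = np.zeros((T, n), dtype=int)
    logp[0] = np.log(start_p) + np.log(emit_p[:, obs[0]])
    for t in range(1, T):
        for s in range(n):
            scores = logp[t - 1] + np.log(trans_p[:, s])
            back[t, s] = int(np.argmax(scores))
            logp[t, s] = scores[back[t, s]] + np.log(emit_p[s, obs[t]])
    # Backtrack from the best final state to recover the full path.
    path = [int(np.argmax(logp[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return [states[s] for s in reversed(path)]

# Toy example with two hypothetical gesture states and binary observations.
states = ["A", "B"]
start_p = np.array([0.6, 0.4])
trans_p = np.array([[0.7, 0.3],
                    [0.4, 0.6]])
emit_p = np.array([[0.9, 0.1],
                   [0.2, 0.8]])
decoded = viterbi([0, 0, 1], states, start_p, trans_p, emit_p)
print(decoded)  # → ['A', 'A', 'B']
```

In FinGTrAC's setting, the hidden states would correspond to candidate finger gestures, the emissions to processed ring/watch sensor features, and the transition probabilities would encode ASL sentence context, biasing the decoder toward grammatically plausible gesture sequences.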