MEASURE4DHAND: DYNAMIC HAND MEASUREMENT EXTRACTION FROM 4D SCANS
Xinxin Dai, Ran Zhao, Pengpeng Hu, Vasile Palade, Adrian Munteanu
Hand measurement is vital for hand-centric applications such as glove design, immobilization design, and protective gear design, to name a few. Vision-based methods have been proposed previously, but they are limited to extracting hand dimensions in a static, standardized posture. However, dynamic hand measurements should be considered when designing these wearable products, since the interaction between the hands and the products cannot be ignored. Unfortunately, none of the existing methods is designed for measuring dynamic hands. To address this problem, we propose a novel method to extract dynamic hand measurements from a sequence of depth images captured by a single depth camera. Firstly, ten dimensions of the hand are defined. Secondly, a deep neural network is developed to predict landmark sequences for the ten dimensions from partial point cloud sequences. Finally, a measurement method is designed to calculate dimension values from the landmark sequences. A novel synthetic dataset consisting of 234K hands in various shapes and poses, along with corresponding ground-truth landmarks, is introduced to train the proposed network. Real-world scans captured by a Kinect are utilized to illustrate the evolution of the ten dimensions during hand movement, and the mean ranges of variation are also reported, providing valuable information for the design of hand wearable products.
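To give a concrete sense of the final step, the sketch below shows one plausible way to turn a predicted landmark sequence into per-frame dimension values and a range of variation: each dimension value is taken as the length of the polyline through its landmarks (closed for girth-like dimensions), and the variation is the spread of that value across frames. This is a minimal illustration under our own assumptions, not the paper's actual measurement method; the function names and the (T, K, 3) layout are hypothetical.

import numpy as np

def dimension_length(landmarks, closed=False):
    # landmarks: (K, 3) array of 3D landmark positions for one dimension.
    pts = np.asarray(landmarks, dtype=float)
    if closed:
        # Close the loop for girth-like dimensions (e.g., wrist circumference).
        pts = np.vstack([pts, pts[:1]])
    # Sum of segment lengths along the polyline through the landmarks.
    return float(np.linalg.norm(np.diff(pts, axis=0), axis=1).sum())

def range_of_variation(landmark_seq, closed=False):
    # landmark_seq: (T, K, 3) predicted landmark sequence for one dimension.
    values = np.array([dimension_length(frame, closed) for frame in landmark_seq])
    # Per-frame values plus the range (max - min) over the motion sequence.
    return values, float(values.max() - values.min())

# Usage with a dummy sequence: 30 frames, 8 landmarks per dimension.
seq = np.random.rand(30, 8, 3)
values, variation = range_of_variation(seq, closed=True)
print(values.mean(), variation)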