GENDER RECOGNITION ON RGB-D IMAGE
Xiaoxiong Zhang, Sajid Javed, Ahmad Obeid, Jorge Dias, Naoufel Werghi
In this paper, we propose a deep-learning approach for human gender classification on RGB-D images. Unlike most existing methods, which rely on hand-crafted features from the human face, we exploit local information from the head and global information from the whole body to classify a person's gender. A YOLO-based head detector is fine-tuned to detect head regions in the images automatically. Two gender classifiers are trained separately, one on head images and one on whole-body images, and the final prediction is obtained by fusing the two classifiers' outputs. The presented method outperforms the state of the art, improving accuracy by 2.6%, 7.6%, and 8.4% on three test sets of a challenging gender dataset covering standing, walking, and interacting scenarios.
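The abstract describes a late-fusion step combining the head-based and body-based classifiers. Below is a minimal sketch of such a fusion, assuming each classifier emits per-class probabilities; the weighting scheme, the `alpha` parameter, and the class ordering are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def fuse_gender_predictions(p_head: np.ndarray, p_body: np.ndarray,
                            alpha: float = 0.5) -> int:
    """Combine head- and body-classifier probabilities and return the
    predicted class index (e.g. 0 = female, 1 = male).
    `alpha` is a hypothetical fusion weight; 0.5 gives a simple average."""
    p_fused = alpha * p_head + (1.0 - alpha) * p_body  # weighted sum of the two score vectors
    return int(np.argmax(p_fused))

# Example: the head branch leans one way, the body branch the other.
p_head = np.array([0.30, 0.70])   # [P(class 0), P(class 1)] from the head classifier
p_body = np.array([0.55, 0.45])   # [P(class 0), P(class 1)] from the body classifier
print(fuse_gender_predictions(p_head, p_body))  # fused prediction index
```

A weighted average is only one of several possible fusion rules (product, max, or a learned combiner are common alternatives); the paper's exact fusion scheme should be taken from the full text.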