CONTROLLING THE FRÉCHET VARIANCE IMPROVES BATCH NORMALIZATION ON THE SYMMETRIC POSITIVE DEFINITE MANIFOLD
Reinmar Kobler, Jun-ichiro Hirayama, Motoaki Kawanabe
Symmetric positive definite (SPD) matrices, and in particular covariance matrices used as data descriptors, find widespread application in various fields, including pure machine learning in the form of covariance pooling in convolutional neural networks. SPD matrices form a Riemannian manifold, necessitating machine learning methods that take this structure into account. In this work, we build upon previous works and propose a batch normalization algorithm for the SPD manifold that can be readily combined with SPD neural networks and, unlike previous works, controls both the Fréchet mean and the Fréchet variance on the SPD manifold. The proposed method is validated in simulations and on small-sample datasets from three application domains: action recognition from human motion trajectories, image classification, and mental imagery detection from electroencephalographic (EEG) signals. The combined results show a systematic performance increase over previous works and tangent-space approximations, as well as improved robustness to low signal-to-noise ratios and lack of data.
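To make the idea concrete, the following is a minimal NumPy sketch of batch normalization on the SPD manifold that re-centers the batch Fréchet mean and rescales the batch Fréchet variance under the affine-invariant metric. It is an illustration of the general technique, not the authors' implementation: the function names (spd_batch_norm, frechet_mean, _sym_fun) and the parameters bias, scale, eps, and iters are assumptions introduced here, the metric choice is assumed, and fixed parameters are used in place of the learnable bias/scale and running statistics a trainable layer would carry.

import numpy as np

def _sym_fun(A, fun):
    # Apply a scalar function to the eigenvalues of a symmetric matrix.
    w, V = np.linalg.eigh(A)
    return (V * fun(w)) @ V.T

def spd_dist(A, B):
    # Affine-invariant geodesic distance between SPD matrices A and B.
    A_isqrt = _sym_fun(A, lambda w: w ** -0.5)
    return np.linalg.norm(_sym_fun(A_isqrt @ B @ A_isqrt, np.log), 'fro')

def frechet_mean(mats, iters=20):
    # Karcher-flow estimate of the Fréchet (geometric) mean of a batch.
    M = np.mean(mats, axis=0)  # initialize with the arithmetic mean
    for _ in range(iters):
        M_sqrt = _sym_fun(M, np.sqrt)
        M_isqrt = _sym_fun(M, lambda w: w ** -0.5)
        # average the batch in the tangent space at the current estimate
        T = np.mean([_sym_fun(M_isqrt @ X @ M_isqrt, np.log) for X in mats], axis=0)
        M = M_sqrt @ _sym_fun(T, np.exp) @ M_sqrt
    return M

def spd_batch_norm(mats, bias=None, scale=1.0, eps=1e-6):
    # Re-center the batch Fréchet mean (to `bias`, identity by default) and
    # rescale the dispersion so the Fréchet variance becomes `scale` ** 2.
    M = frechet_mean(mats)
    var = np.mean([spd_dist(X, M) ** 2 for X in mats])
    M_isqrt = _sym_fun(M, lambda w: w ** -0.5)
    out = []
    for X in mats:
        S = M_isqrt @ X @ M_isqrt                      # move the batch mean to the identity
        L = _sym_fun(S, np.log) * scale / np.sqrt(var + eps)
        S = _sym_fun(L, np.exp)                        # rescale the Fréchet variance
        if bias is not None:                           # translate to a target mean
            B_sqrt = _sym_fun(bias, np.sqrt)
            S = B_sqrt @ S @ B_sqrt
        out.append(S)
    return np.array(out)

# Toy usage: normalize a batch of random 4x4 SPD matrices.
rng = np.random.default_rng(0)
batch = np.array([a @ a.T + 4.0 * np.eye(4)
                  for a in rng.standard_normal((8, 4, 4))])
normed = spd_batch_norm(batch)

In this sketch, congruence with the inverse square root of the batch mean transports the batch to the identity, taking a matrix power in the tangent space at the identity scales all geodesic distances uniformly (and hence the Fréchet variance), and congruence with the square root of a target bias matrix translates the batch to the desired mean without changing its dispersion.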