Hadamard Layer to Improve Semantic Segmentation
Angello Hoyos (Centro de Investigación en Matemáticas, A.C.); Mariano Rivera (Centro de Investigación en Matemáticas, A.C.)
SPS
We present the Hadamard Layer, a simple and computationally efficient way to improve results in semantic segmentation tasks. This layer has no trainable parameters; therefore it does not increase the number of model parameters, and its extra computational cost is marginal. Experimental results show that the new Hadamard layer substantially improves the performance of the investigated models (variants of the Pix2Pix model). The improvement can be explained by the Hadamard layer forcing the network to produce an internal encoding of the classes in which all bins are active, so the network's computation is more distributed. In effect, the Hadamard layer requires that, to change the predicted class, 2^(k−1) bins be modified, assuming k bins in the encoding. A specific loss function allows stable and fast training convergence.
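The class-coding idea behind the layer can be illustrated with a Sylvester-construction Hadamard code: each class is assigned a ±1 codeword (a row of a Hadamard matrix), and any two distinct codewords of length n differ in n/2 bins, so flipping the predicted class requires changing many bins at once. The sketch below is our own minimal illustration under these assumptions, not the authors' implementation; the function names `hadamard` and `decode` are ours.

```python
import numpy as np

def hadamard(m):
    """Sylvester construction: H_{2n} = [[H, H], [H, -H]], entries in {+1, -1}."""
    H = np.array([[1]])
    for _ in range(m):
        H = np.block([[H, H], [H, -H]])
    return H

def decode(y, H):
    """Nearest-codeword decoding by correlation with the rows of H."""
    return int(np.argmax(H @ y))

# Codewords for 8 classes: rows of an 8x8 Hadamard matrix.
H = hadamard(3)
# Any two distinct rows differ in 8/2 = 4 positions, so a prediction
# cannot switch class by perturbing a single output bin.
```

In a segmentation head, the network would regress one such codeword per pixel, and the predicted class is the codeword with the highest correlation.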