Regularizing Neural Networks by Stochastically Training Layer Ensembles
Alex Labach, Shahrokh Valaee
Dropout and similar stochastic neural network regularization methods are often interpreted as implicitly averaging over a large ensemble of models. We propose STE (stochastically trained ensemble) layers, which enhance the averaging properties of such methods by training an ensemble of weight matrices with stochastic regularization while explicitly averaging outputs. This provides stronger regularization with no additional computational cost at test time. We show consistent improvement on various image classification tasks using standard network topologies.
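As a minimal sketch of the idea described above (not the authors' implementation), the PyTorch module below keeps an ensemble of weight matrices in a single layer; the class name STELinear and the parameters ensemble_size and drop_p are illustrative assumptions. During training, each ensemble member processes an independently dropout-masked copy of the input and the member outputs are averaged explicitly. At test time the ensemble collapses into one averaged weight matrix, so inference costs the same as a plain linear layer.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class STELinear(nn.Module):
    """Sketch of a stochastically trained ensemble (STE) linear layer."""

    def __init__(self, in_features, out_features, ensemble_size=4, drop_p=0.5):
        super().__init__()
        # One weight matrix per ensemble member.
        self.weights = nn.Parameter(
            0.01 * torch.randn(ensemble_size, out_features, in_features)
        )
        self.bias = nn.Parameter(torch.zeros(out_features))
        self.drop_p = drop_p

    def forward(self, x):
        if self.training:
            # Each member sees an independently dropout-masked copy of the
            # input; the member outputs are averaged explicitly.
            outs = [
                F.linear(F.dropout(x, self.drop_p, training=True), w)
                for w in self.weights
            ]
            return torch.stack(outs).mean(dim=0) + self.bias
        # Test time: collapse the ensemble into a single averaged weight
        # matrix. F.dropout is inverted dropout (it rescales during
        # training), so no extra rescaling is needed here.
        return F.linear(x, self.weights.mean(dim=0), self.bias)
```

For instance, swapping a hidden nn.Linear(512, 512) for STELinear(512, 512) would train four weight matrices jointly while leaving the layer's test-time compute unchanged, consistent with the "no additional computational cost at test time" claim.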