High-Dimensional Neural Feature Using Rectified Linear Unit And Random Matrix Instance
Alireza M. Javid, Arun Venkitaraman, Mikael Skoglund, Saikat Chatterjee
We design a ReLU-based multilayer neural network to generate a rich high-dimensional feature vector. The feature guarantees a monotonically decreasing training cost as the number of layers increases. We design the weight matrix in each layer to extend the feature vector to a higher-dimensional space while providing a richer representation in the sense of training cost. Linear projection to the target in the higher-dimensional space leads to a lower training cost if a convex cost is minimized. An $\ell_2$-norm convex constraint is used in the minimization to improve generalization and avoid overfitting. The regularization hyperparameters of the network are derived analytically to guarantee the monotonic decrement of the training cost, thereby eliminating the need for cross-validation to find the regularization hyperparameter of each layer.
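To make the construction concrete, here is a minimal NumPy sketch of the layer-wise idea, not the paper's exact algorithm: each layer stacks a lossless part (exploiting the identity $\mathrm{ReLU}(u) - \mathrm{ReLU}(-u) = u$, so the previous feature remains linearly recoverable) on top of a random matrix instance that extends the dimension, and a ridge-regularized least-squares projection (a convex surrogate for the $\ell_2$-norm-constrained minimization) maps the feature to the target. The names `train_layerwise` and `extra_dims` and the fixed penalty `lam` are illustrative assumptions; the paper derives the regularization hyperparameters analytically rather than fixing them.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

def train_layerwise(X, T, num_layers=5, extra_dims=50, lam=1e-2):
    """Grow the feature layer by layer and report the training cost of a
    ridge-regularized linear projection to the targets T (a sketch, not
    the paper's exact hyperparameter rule)."""
    Z = X                      # current feature, shape (dim, num_samples)
    costs = []
    for _ in range(num_layers):
        in_dim = Z.shape[0]
        # Weight matrix: a lossless part [I; -I] (ReLU(u) - ReLU(-u) = u
        # preserves the previous feature) stacked on a random matrix
        # instance that extends the feature to a higher dimension.
        W = np.vstack([np.eye(in_dim),
                       -np.eye(in_dim),
                       rng.standard_normal((extra_dims, in_dim)) / np.sqrt(in_dim)])
        Z = relu(W @ Z)        # new, higher-dimensional feature
        # Convex least-squares projection to the target; the ridge term
        # lam stands in for the analytically derived hyperparameter.
        d = Z.shape[0]
        O = T @ Z.T @ np.linalg.inv(Z @ Z.T + lam * np.eye(d))
        costs.append(np.linalg.norm(T - O @ Z) ** 2 / T.shape[1])
    return costs

# Toy usage: 3-dimensional inputs, 2-dimensional targets, 100 samples.
X = rng.standard_normal((3, 100))
T = rng.standard_normal((2, 100))
print(train_layerwise(X, T))   # training cost per layer
```

Because each layer's feature linearly embeds the previous one, the optimal convex projection at layer $l+1$ can do no worse than the one at layer $l$, which is the intuition behind the monotonically decreasing training cost.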