Removing Dimensional Restrictions On Complex/Hyper-Complex Neural Networks
Chase Gaudet, Anthony Maida
SPS
It has been shown that the core reason complex- and hypercomplex-valued neural networks offer improvements over their real-valued counterparts is that their algebra forces multi-dimensional data to be treated as a single entity. However, both are constrained to a fixed number of dimensions: two for complex numbers and four for quaternions. These observations motivate us to introduce novel vector map convolutions, which capture this property while dropping the unnatural dimensionality constraints their algebra imposes. This is achieved by introducing a system that mimics the Hamilton product's unique linear combination of input dimensions using a permutation function, together with batch normalization and weight initialization schemes for the system. We perform two experiments showing that these novel vector map convolutions capture the benefits of complex and hypercomplex networks while avoiding the dimensionality restriction.
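To illustrate the core idea, here is a minimal NumPy sketch of a permutation-based mixing of a D-dimensional input, in the spirit of the Hamilton product's linear combination. The cyclic-shift permutation and the function name `vector_map_combine` are assumptions for illustration; the abstract does not specify the paper's exact permutation function or any sign pattern.

```python
import numpy as np

def vector_map_combine(x, w):
    """Hypothetical sketch: mix D input components with D weight
    components via a cyclic permutation, mimicking the Hamilton
    product's linear combination without fixing D to 2 (complex)
    or 4 (quaternion).

    x: shape (D,) -- one multi-dimensional input entity
    w: shape (D,) -- one learnable weight vector
    Returns shape (D,): each output component is a distinct
    permuted combination of all input components.
    """
    D = len(x)
    out = np.empty(D)
    for i in range(D):
        # Assumed permutation: cyclic shift by i, so every output
        # dimension sees all input dimensions in a different order.
        perm = np.roll(np.arange(D), -i)
        out[i] = np.dot(w, x[perm])
    return out
```

In a full convolution layer this per-position mixing would replace the independent per-channel weights of a real-valued layer, so all D dimensions of the input are treated as a single entity for any D, not just D = 2 or D = 4.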