IMPLICITLY ROTATION EQUIVARIANT NEURAL NETWORKS

Naman Khetan (IIT (ISM) Dhanbad); Tushar Arora (IIT (ISM) Dhanbad); Samee Ur Rehman (Transmute AI); Deepak K Gupta (UiT The Arctic University of Norway)

  • SPS
    Members: Free
    IEEE Members: $11.00
    Non-members: $15.00
06 Jun 2023

Convolutional Neural Networks (CNNs) are inherently equivariant under translations; however, they have no equivalent built-in mechanism for handling other transformations such as rotations. Existing solutions require redesigning standard networks with filters mapped from combinations of predefined bases involving complex analytical functions. Such formulations are hard to implement, and the restrictions imposed on the choice of basis can lead to model weights that are sub-optimal for the primary deep learning task (e.g., classification). We propose the Implicitly Equivariant Network (IEN), which induces approximate equivariance in the different layers of a standard CNN by optimizing a multi-objective loss function. We show for ResNet models on Rot-MNIST and Rot-TinyImageNet that, despite its simple formulation, IEN performs on par with or better than steerable networks. IEN also facilitates the construction of heterogeneous filter groups, allowing the number of channels in CNNs to be reduced by over 30%. Further, we demonstrate that for the hard problem of visual object tracking, IEN outperforms the state-of-the-art rotation-equivariant tracking method while providing faster inference.
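The core idea of penalizing the gap between "rotate then filter" and "filter then rotate" can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: it uses a toy single-filter layer, restricts to 90-degree rotations (where rotating a feature map is exact), and the names `feature_map` and `equivariance_loss` are invented for illustration. In the actual IEN setting, a term of this form would be added to the task loss for each chosen layer.

```python
import numpy as np

def feature_map(x, w):
    """Toy 'layer': valid cross-correlation of a 2D input with one filter."""
    h, ww = w.shape
    H, W = x.shape
    out = np.zeros((H - h + 1, W - ww + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + h, j:j + ww] * w)
    return out

def equivariance_loss(x, w):
    """MSE between f(rot(x)) and rot(f(x)); zero when the layer commutes
    with 90-degree rotation, positive otherwise."""
    f_of_rot = feature_map(np.rot90(x), w)
    rot_of_f = np.rot90(feature_map(x, w))
    return np.mean((f_of_rot - rot_of_f) ** 2)

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8))
w_sym = np.ones((3, 3))                 # rotation-symmetric filter
w_asym = rng.standard_normal((3, 3))    # generic filter

print(equivariance_loss(x, w_sym))      # ~0: symmetric filter is equivariant
print(equivariance_loss(x, w_asym))     # > 0: generic filter breaks equivariance
```

Minimizing such a penalty alongside the classification loss pushes the learned filters toward (approximate) rotation equivariance without constraining them to a predefined basis, which is the trade-off the abstract highlights.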
