On Cross-Layer Alignment for Model Fusion of Heterogeneous Neural Networks
Dang Nguyen (VinAI); Thien Trang Nguyen Vu (Hanoi University of Science and Technology); Khai Nguyen (University of Texas at Austin); Dinh Q Phung (Monash University); Hung Bui (VinAI Research); Nhat Ho (University of Texas at Austin)
OTFusion, or layer-wise model fusion via optimal transport, applies soft neuron association to unify different pre-trained networks. Despite its effectiveness in saving computational resources, OTFusion requires the input networks to have the same number of layers. To address this limitation, we propose a novel model fusion framework, named CLAFusion, that fuses neural networks with different numbers of layers, which we refer to as heterogeneous neural networks, via cross-layer alignment. We demonstrate that the cross-layer alignment problem, which is an unbalanced assignment problem, can be solved efficiently using dynamic programming. Based on the cross-layer alignment, our framework balances the number of layers of the neural networks before applying layer-wise model fusion. Our experiments indicate that CLAFusion, with an extra fine-tuning step, improves the accuracy of residual networks on the CIFAR10, CIFAR100, and Tiny-ImageNet datasets. Furthermore, we explore its practical usage for model compression and knowledge distillation when applied to the teacher-student setting.
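The abstract notes that cross-layer alignment is an unbalanced assignment problem solvable by dynamic programming. The sketch below illustrates one way such a DP could look: each layer of the shallower network is matched to a distinct layer of the deeper one while preserving layer order, minimizing a total matching cost. The cost matrix here is a hypothetical input (e.g., a dissimilarity between layer representations); the paper's actual cost definition and recurrence may differ.

```python
import numpy as np

def cross_layer_alignment(cost):
    """Order-preserving unbalanced assignment via dynamic programming.

    cost[i, j] is a hypothetical dissimilarity between layer i of the
    shallower network (m layers) and layer j of the deeper one (n layers),
    with m <= n. Returns the minimal total cost and the matched index pairs.
    """
    m, n = cost.shape
    assert m <= n, "first network must have at most as many layers as the second"
    INF = float("inf")
    # dp[i, j]: min cost of matching the first i shallow layers
    # within the first j deep layers.
    dp = np.full((m + 1, n + 1), INF)
    dp[0, :] = 0.0
    for i in range(1, m + 1):
        for j in range(i, n + 1):
            # Either skip deep layer j, or match shallow layer i to it.
            dp[i, j] = min(dp[i, j - 1], dp[i - 1, j - 1] + cost[i - 1, j - 1])
    # Backtrack to recover the order-preserving matching.
    match, i, j = [], m, n
    while i > 0:
        if dp[i, j] == dp[i, j - 1]:
            j -= 1                      # deep layer j was skipped
        else:
            match.append((i - 1, j - 1))  # layers matched
            i -= 1
            j -= 1
    return dp[m, n], match[::-1]
```

Because each subproblem only looks one row and one column back, the DP runs in O(mn) time, which matches the claim that the alignment can be computed efficiently.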