LADDER SIAMESE NETWORK: A METHOD AND INSIGHTS FOR MULTI-LEVEL SELF-SUPERVISED LEARNING

Ryota Yoshihashi, Shuhei Nishimura, Dai Yonebayashi, Yuya Otsuka, Tomohiro Tanaka, Takashi Miyazaki

Poster, 11 Oct 2023

In Siamese-network-based self-supervised learning (SSL), multi-level supervision (MLS) is a natural extension that enforces the consistency of intermediate representations against data augmentations. Although existing studies have incorporated MLS to boost system performance in combination with other ideas, simple MLS without bells and whistles has not been deeply analyzed. Here, we extensively investigate how MLS works and how large an impact it has on SSL performance under various training settings, in order to understand the effectiveness of MLS by itself. For this investigation, we develop a simple Siamese-SSL-based MLS framework, the Ladder Siamese Network, equipped with multi-level, non-contrastive, global and local self-supervised training losses. We show that the proposed framework can simultaneously improve BYOL baselines in classification, detection, and segmentation solely by adding MLS. In comparison with state-of-the-art methods, our Ladder-based model achieves competitive and balanced performance across all tested benchmarks without large degradation in any of them, which suggests its usefulness for building a multi-purpose backbone.
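For concreteness, below is a minimal PyTorch sketch of what multi-level, non-contrastive supervision on a Siamese network can look like: BYOL-style negative-cosine losses are attached to every backbone stage rather than only the final one. The stage split, projector/predictor sizes, equal weighting across levels, and the use of only the global (pooled) loss are illustrative assumptions for this sketch, not the authors' exact configuration; the paper additionally employs local losses.

```python
# A minimal sketch of multi-level supervision (MLS) in a BYOL-style Siamese
# setup. Module sizes, the stage split, and equal loss weights are
# illustrative assumptions; the paper's configuration may differ.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F
import torchvision

def mlp(in_dim, hidden=4096, out=256):
    # BYOL-style projector/predictor head.
    return nn.Sequential(nn.Linear(in_dim, hidden), nn.BatchNorm1d(hidden),
                         nn.ReLU(inplace=True), nn.Linear(hidden, out))

class LadderBYOL(nn.Module):
    def __init__(self, momentum=0.996):
        super().__init__()
        r = torchvision.models.resnet50()
        # Split the backbone into stages so intermediate features are exposed.
        self.stages = nn.ModuleList([
            nn.Sequential(r.conv1, r.bn1, r.relu, r.maxpool, r.layer1),
            r.layer2, r.layer3, r.layer4,
        ])
        dims = [256, 512, 1024, 2048]  # ResNet-50 stage output channels
        self.projectors = nn.ModuleList([mlp(d) for d in dims])
        self.predictors = nn.ModuleList([mlp(256, out=256) for _ in dims])
        # Target (momentum) branch: EMA copy of stages and projectors.
        self.t_stages = copy.deepcopy(self.stages)
        self.t_projectors = copy.deepcopy(self.projectors)
        for p in list(self.t_stages.parameters()) + list(self.t_projectors.parameters()):
            p.requires_grad = False
        self.m = momentum

    def _features(self, x, stages):
        feats = []
        for stage in stages:
            x = stage(x)
            feats.append(x.mean(dim=(2, 3)))  # global average pool per stage
        return feats

    def forward(self, v1, v2):
        # Symmetrized loss: each view passes through both branches.
        loss = 0.0
        for a, b in ((v1, v2), (v2, v1)):
            online = self._features(a, self.stages)
            with torch.no_grad():
                target = self._features(b, self.t_stages)
            for f_o, f_t, proj, t_proj, pred in zip(
                    online, target, self.projectors, self.t_projectors, self.predictors):
                p = pred(proj(f_o))
                z = t_proj(f_t).detach()
                # Negative-cosine (BYOL) loss, accumulated over all levels.
                loss = loss + (2 - 2 * F.cosine_similarity(p, z, dim=-1).mean())
        return loss / (2 * len(self.stages))

    @torch.no_grad()
    def update_target(self):
        # Exponential moving average update of the target branch.
        for o, t in zip(list(self.stages.parameters()) + list(self.projectors.parameters()),
                        list(self.t_stages.parameters()) + list(self.t_projectors.parameters())):
            t.mul_(self.m).add_(o, alpha=1 - self.m)
```

In a training step, one would compute `loss = model(view1, view2)` for two augmented views, backpropagate through the online branch only (the target branch is gradient-free), and then call `model.update_target()`; removing all but the last entry of `self.stages` recovers a standard single-level BYOL baseline.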