
Ultimate Negative Sampling For Contrastive Learning

Huijie Guo (Beihang University); Lei Shi (Beihang University)

  • SPS
    Members: Free
    IEEE Members: $11.00
    Non-members: $15.00
06 Jun 2023

Unsupervised learning has received increasing attention due to the strong performance of contrastive learning methods. Most contrastive methods use data augmentation to construct positive and negative pairs: an augmented view of the same sample is treated as a positive, while all other samples are treated as negatives. This negative sampling strategy is highly random and ignores samples that are semantically similar to the anchor, a problem known as sampling bias. Prior work addresses this problem by weighting negative samples according to their similarity. In this paper, we propose a novel ultimate negative sampling method for contrastive learning. Unlike random sampling, we select negative samples with a more extreme mechanism based on the ideal representation of each sample. Furthermore, we constrain the consistency between samples across the representation space. We demonstrate the proposed method's superiority on multiple benchmark datasets.
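To make the sampling-bias setting concrete, the following is a minimal sketch of an InfoNCE-style contrastive loss in which each negative's contribution is re-weighted by its similarity to the anchor. This is a generic similarity-weighting scheme of the kind the abstract attributes to prior work, not the paper's ultimate negative sampling method; the function name, the temperature `tau`, and the weighting coefficient `beta` are illustrative assumptions.

```python
import numpy as np

def info_nce_weighted(anchor, positive, negatives, tau=0.5, beta=1.0):
    """InfoNCE-style loss with similarity-weighted negatives (a generic
    sketch, not the paper's exact method).

    Each negative's term in the denominator is scaled by exp(beta * sim),
    so negatives that are semantically close to the anchor are emphasised
    (beta > 0) or down-weighted (beta < 0)."""
    def cos(a, b):
        # Cosine similarity between two embedding vectors.
        return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

    pos = np.exp(cos(anchor, positive) / tau)
    sims = np.array([cos(anchor, n) for n in negatives])
    weights = np.exp(beta * sims)
    weights = weights / weights.mean()          # normalise: weights average to 1
    neg = np.sum(weights * np.exp(sims / tau))  # weighted negative mass
    return -np.log(pos / (pos + neg))
```

With `beta = 0` all weights collapse to 1 and the expression reduces to the standard uniform-negative InfoNCE loss, which is the random-sampling baseline the paper contrasts against.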
