
Permutation Invariant Training for Paraphrase Identification

Jun Bai (Beihang University); Chuantao Yin (Beihang University); Hanhua Hong (Beihang University); Jianfei Zhang (Beihang University); Chen Li (Beihang University); Yanmeng Wang (Ping An Technology); Wenge Rong (Beihang University)

06 Jun 2023

Identifying sentences that share similar meanings is crucial to speech and text understanding. Although currently popular cross-encoder solutions with pre-trained language models as the backbone have achieved remarkable performance, they lack permutation invariance, or symmetry, which is one of the most important inductive biases for this task. To alleviate this issue, we propose a permutation invariant training framework in which a symmetry regularization is introduced during training, forcing the model to produce the same predictions for input sentence pairs in both the forward and backward directions. Empirical studies exhibit improved performance over competitive baselines. A minimal sketch of such a symmetry-regularized objective is shown below.
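
The sketch below illustrates one way the described idea could be realized: a cross-encoder scores a sentence pair in both orders, and a symmetry penalty pushes the two prediction distributions toward each other. The backbone choice (bert-base-uncased), the symmetric-KL penalty, and the weight sym_weight are illustrative assumptions, not details taken from the paper.

    # Hedged sketch of a symmetry-regularized training objective for a
    # cross-encoder paraphrase classifier. Backbone, penalty form, and
    # hyperparameters are assumptions for illustration only.
    import torch
    import torch.nn.functional as F
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModelForSequenceClassification.from_pretrained(
        "bert-base-uncased", num_labels=2
    )

    def permutation_invariant_loss(sent_a, sent_b, labels, sym_weight=1.0):
        """Cross-entropy on both input orders plus a penalty that is zero
        only when the forward and backward predictions coincide."""
        fwd = tokenizer(sent_a, sent_b, padding=True, truncation=True, return_tensors="pt")
        bwd = tokenizer(sent_b, sent_a, padding=True, truncation=True, return_tensors="pt")
        logits_fwd = model(**fwd).logits
        logits_bwd = model(**bwd).logits

        # Supervised loss averaged over the two input orders.
        ce = 0.5 * (F.cross_entropy(logits_fwd, labels) + F.cross_entropy(logits_bwd, labels))

        # Symmetric KL divergence between the two prediction distributions.
        log_p_fwd = F.log_softmax(logits_fwd, dim=-1)
        log_p_bwd = F.log_softmax(logits_bwd, dim=-1)
        sym = 0.5 * (
            F.kl_div(log_p_fwd, log_p_bwd, log_target=True, reduction="batchmean")
            + F.kl_div(log_p_bwd, log_p_fwd, log_target=True, reduction="batchmean")
        )
        return ce + sym_weight * sym

    # Toy usage on a single labeled pair (1 = paraphrase).
    loss = permutation_invariant_loss(
        ["How do I learn Python quickly?"],
        ["What is the fastest way to learn Python?"],
        torch.tensor([1]),
    )
    loss.backward()

At inference time, a model trained this way should give (near-)identical scores regardless of which sentence is placed first, which is the symmetry property the abstract identifies as missing from standard cross-encoders.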
