Phrase-level Global-local Hybrid Model for Sentence Embedding

Mingyu Tang, Liansheng Zhuang, Houqiang Li, Jian Yang, Yanqun Guo

  • SPS
    Members: Free
    IEEE Members: $11.00
    Non-members: $15.00
    Length: 07:38
09 Jul 2020

Latent structure models have drawn much attention due to their ability to learn optimal latent hierarchical structures without explicit structure annotations. However, most existing models suffer from high computational complexity and are difficult to train. To this end, this paper proposes a novel phrase-level global-local hybrid model, which inherits the advantages of existing latent structure models while requiring lower time complexity. Our model splits a sentence into multiple phrases with a category-selection module. It then encodes context dependency with a phrase-level global encoding module and task-specific information with a phrase-level local encoding module. Finally, the sentence embedding is obtained by integrating the global and task-specific encodings. Experiments on public benchmarks show that our model achieves state-of-the-art performance on sentence classification and natural language inference tasks. Meanwhile, our model is at least 10 times faster than the existing state-of-the-art method at the training stage.
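The abstract describes a three-stage pipeline: phrase splitting, parallel global and local encoding, and integration. A minimal sketch of that data flow is below; all function names, dimensions, and pooling choices are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # embedding dimension (assumed)

def split_into_phrases(token_embs, boundaries):
    """Stand-in for the category-selection module: split a sentence's
    token embeddings into phrases at the given boundary indices."""
    return np.split(token_embs, boundaries)

def phrase_global_encoding(phrases, W_g):
    """Phrase-level global encoding: captures context dependency across
    phrases (here, a shared linear map over phrase means)."""
    phrase_means = np.stack([p.mean(axis=0) for p in phrases])
    return np.tanh(phrase_means @ W_g).mean(axis=0)

def phrase_local_encoding(phrases, W_l):
    """Phrase-level local encoding: task-specific encoding of each
    phrase independently (here, max-pooled within each phrase)."""
    pooled = np.stack([np.tanh(p @ W_l).max(axis=0) for p in phrases])
    return pooled.mean(axis=0)

def sentence_embedding(token_embs, boundaries, W_g, W_l):
    phrases = split_into_phrases(token_embs, boundaries)
    g = phrase_global_encoding(phrases, W_g)
    l = phrase_local_encoding(phrases, W_l)
    return np.concatenate([g, l])  # integrate global + local encodings

tokens = rng.standard_normal((6, d))  # a 6-token sentence
emb = sentence_embedding(tokens, [2, 4],
                         rng.standard_normal((d, d)),
                         rng.standard_normal((d, d)))
print(emb.shape)  # (16,)
```

Because each phrase is encoded independently before integration, the per-phrase work can be done in parallel, which is consistent with the claimed speedup over latent structure models that build a full hierarchy.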
