Integrating Syntactic and Semantic Knowledge in AMR Parsing with Heterogeneous Graph Attention Network

Yikemaiti Sataer (Southeast University); Chuanqi Shi (Southeast University); Miao Gao (Southeast University); Yunlong Fan (Southeast University); Bin Li (Southeast University); Zhiqiang Gao (Southeast University)

06 Jun 2023

Abstract Meaning Representation (AMR) parsing is the task of translating a sentence into an AMR semantic graph that captures the basic meaning of the sentence; recently, the task has been empowered by pre-trained Transformer models. These models encode syntactic and semantic knowledge only implicitly, through self-supervised pre-training tasks. We argue that encoding syntactic and semantic knowledge explicitly is beneficial to AMR parsing and can improve data efficiency. Specifically, syntactic dependency and semantic role labeling (SRL) structures share similar sub-structures with AMR. In this work, we propose a novel linguistic knowledge-enhanced AMR parsing model that augments the pre-trained Transformer with the syntactic dependency and semantic role labeling structures of sentences. By applying a heterogeneous graph attention network, we obtain syntactically and semantically augmented word representations, which are integrated with the Transformer representations through an attentive integration layer and a gating mechanism. Experimental results show that our model achieves state-of-the-art performance on different benchmarks, especially in out-of-domain and low-resource scenarios.
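To make the fusion step concrete, below is a minimal PyTorch sketch of a gating mechanism of the kind the abstract describes. The class name `GatedIntegration` and the single-linear-layer gate are illustrative assumptions, not the paper's exact architecture; the heterogeneous graph attention network that would produce the augmented states is stubbed out as a plain tensor.

```python
import torch
import torch.nn as nn

class GatedIntegration(nn.Module):
    """Fuse base Transformer states with graph-augmented states via a
    learned sigmoid gate. Hypothetical sketch; the paper's attentive
    integration layer and gate may be parameterized differently."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.gate = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, h_transformer: torch.Tensor,
                h_graph: torch.Tensor) -> torch.Tensor:
        # g in (0, 1) decides, per dimension, how much of the
        # syntactically/semantically augmented signal to let through.
        g = torch.sigmoid(
            self.gate(torch.cat([h_transformer, h_graph], dim=-1)))
        return g * h_graph + (1.0 - g) * h_transformer


# Toy usage: batch of 2 sentences, 5 tokens each, hidden size 768.
fuse = GatedIntegration(768)
h_t = torch.randn(2, 5, 768)   # pre-trained Transformer states
h_g = torch.randn(2, 5, 768)   # heterogeneous-GAT-augmented states
out = fuse(h_t, h_g)           # shape: (2, 5, 768)
```

The per-dimension gate lets the model fall back on the pre-trained representation wherever the predicted dependency or SRL structure is unreliable, which is one plausible reason explicit linguistic knowledge helps most in out-of-domain and low-resource settings.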
