Relational Representation Learning for Zero-shot Relation Extraction with Instance Prompting and Prototype Rectification
Bin Duan (Beijing University of Posts and Telecommunications); Xingxian Liu (Beijing University of Posts and Telecommunications); Shusen Wang (Beijing University of Posts and Telecommunications); Yajing Xu (Beijing University of Posts and Telecommunications); Bo Xiao (Beijing University of Posts and Telecommunications)
SPS
Zero-shot relation extraction aims to extract novel relations that are not observed during training. However, existing representation methods are not pre-trained for relational representations, and their embeddings carry much general linguistic information, so the distances between them are not consistent with relational semantic similarity. In this paper, we propose a novel method based on Instance Prompting and Prototype Rectification (IPPR) to conduct relational representation learning for zero-shot relation extraction. Instance prompting reduces the gap between pre-training and fine-tuning and guides the pre-trained model to generate relation-oriented instance representations. Prototype rectification pushes the prototype embeddings away from each other and pulls each instance embedding closer to its corresponding prototype embedding, dynamically rectifying the prototype embeddings. Experimental results on two public datasets demonstrate that our proposed method achieves new state-of-the-art performance.
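The two objectives of prototype rectification described above — pulling each instance embedding toward its class prototype while pushing prototype embeddings apart — can be illustrated with a minimal NumPy sketch. The function names, the squared-distance pull term, and the cosine-similarity push term here are illustrative assumptions, not the paper's actual formulation:

```python
import numpy as np

def rectification_losses(instances, labels, prototypes):
    """Toy sketch of the two rectification objectives (illustrative only):
    - pull: mean squared distance from each instance to its class prototype
    - push: mean pairwise cosine similarity between distinct prototypes
    Minimizing pull draws instances toward their prototypes; minimizing
    push drives prototypes away from each other.
    """
    # pull term: distance of each instance to the prototype of its label
    pull = np.mean(np.sum((instances - prototypes[labels]) ** 2, axis=1))
    # push term: average off-diagonal cosine similarity among prototypes
    normed = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)
    sim = normed @ normed.T
    k = len(prototypes)
    push = (sim.sum() - np.trace(sim)) / (k * (k - 1))
    return pull, push

# prototypes initialized as per-class means of instance embeddings
X = np.array([[1.0, 0.0], [0.8, 0.2], [0.0, 1.0], [0.2, 0.8]])
y = np.array([0, 0, 1, 1])
protos = np.stack([X[y == c].mean(axis=0) for c in (0, 1)])
pull, push = rectification_losses(X, y, protos)
```

In a training loop, a weighted sum of the two terms would be minimized so that the prototype embeddings are rectified dynamically as the instance representations improve.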