Generic Dependency Modeling for Multi-Party Conversation
Weizhou Shen (Sun Yat-sen University); Xiaojun Quan (Sun Yat-sen University); Ke Yang (Sun Yat-sen University)
To model the dependencies between utterances in multi-party conversations, we propose a simple and generic framework based on the dependency parsing of utterances. In particular, we present an approach that encodes these dependencies as relative dependency encodings (ReDE) and show how to implement it in Transformers by modifying the computation of self-attention. Experimental results on four multi-party conversation benchmarks show that this framework consistently improves two Transformer-based language models and achieves performance comparable or superior to state-of-the-art methods.
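The abstract does not spell out the exact form of ReDE, so the sketch below is only a rough illustration of the general idea: it assumes the encoding behaves like a relative position encoding whose index is the signed distance between utterances along the dependency parse, added as a learned bias inside scaled dot-product self-attention. All identifiers (ReDESelfAttention, dep_distance, max_relative_distance) are hypothetical and not taken from the paper.

# Minimal, hypothetical sketch of dependency-aware self-attention (single head).
# Assumption: relative dependency distances between utterances index a learned
# embedding table, and the resulting q_i . r_ij term biases the attention scores,
# analogous to relative position encodings.

import torch
import torch.nn as nn
import torch.nn.functional as F


class ReDESelfAttention(nn.Module):
    """Single-head self-attention with an additive dependency-distance bias."""

    def __init__(self, d_model: int, max_relative_distance: int = 8):
        super().__init__()
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        # One learned vector per clipped relative dependency distance (assumption).
        self.rel_emb = nn.Embedding(2 * max_relative_distance + 1, d_model)
        self.max_rel = max_relative_distance
        self.scale = d_model ** -0.5

    def forward(self, x: torch.Tensor, dep_distance: torch.Tensor) -> torch.Tensor:
        # x:            (batch, num_utterances, d_model) utterance representations
        # dep_distance: (batch, num_utterances, num_utterances) LongTensor of signed
        #               distances between utterances along the dependency parse
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)

        # Standard scaled dot-product attention scores.
        scores = torch.matmul(q, k.transpose(-2, -1)) * self.scale

        # Clip distances, look up their embeddings, and add a per-pair bias q_i . r_ij.
        clipped = dep_distance.clamp(-self.max_rel, self.max_rel) + self.max_rel
        r = self.rel_emb(clipped)                                  # (b, n, n, d_model)
        scores = scores + torch.einsum('bid,bijd->bij', q, r) * self.scale

        attn = F.softmax(scores, dim=-1)
        return torch.matmul(attn, v)

In this reading, replacing the usual token-position offset with a parse-based distance is the only change to the attention computation, which is what makes the framework easy to drop into existing Transformer-based language models.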