SkillNet-NLG: General-Purpose Natural Language Generation with a Sparsely Activated Approach
Junwei Liao (University of Electronic Science and Technology of China); Duyu Tang (Tencent); Fan Zhang (Tianjin University); Shuming Shi (Tsinghua University)
SPS
We present SkillNet-NLG, a sparsely activated approach that handles many natural language generation tasks with one model. Unlike traditional dense models, which always activate all of their parameters, SkillNet-NLG selectively activates only the parameters relevant to a given task, where relevance is controlled by a set of predefined skills. The strength of this design is that it provides an opportunity to precisely adapt the relevant skills when learning new tasks. We evaluate on Chinese natural language generation tasks. Results show that, with only one model file, SkillNet-NLG outperforms the previous best-performing methods on four of five tasks. SkillNet-NLG performs better than two multi-task learning baselines (a dense model and a Mixture-of-Experts model) and achieves performance comparable to task-specific models. Lastly, SkillNet-NLG surpasses the baseline systems when adapted to new tasks.
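The core idea of skill-based sparse activation can be illustrated with a minimal sketch. The code below is a toy model of the mechanism only, not the paper's actual architecture: each "skill" is represented as its own parameter block, a hypothetical task-to-skill mapping declares which skills a task needs, and the forward pass runs only the relevant skill modules while all other parameters stay inactive. The skill names, the mapping, and the averaging combination are illustrative assumptions.

```python
def make_skill(scale):
    """A toy 'expert': a function owning one parameter block
    (here just a single scale factor, for illustration)."""
    return lambda x: [scale * v for v in x]

# Predefined skills, each with its own parameters (names are hypothetical).
skills = {
    "s_general": make_skill(1.0),
    "s_nlg": make_skill(0.5),
    "s_dialogue": make_skill(2.0),
}

# Each task declares which skills are relevant; only those are activated.
task_to_skills = {
    "summarization": ["s_general", "s_nlg"],
    "dialogue": ["s_general", "s_nlg", "s_dialogue"],
}

def forward(task, x):
    """Run the input through only the skills relevant to `task` and
    average their outputs; all other skill parameters stay inactive."""
    active = [skills[name] for name in task_to_skills[task]]
    outputs = [f(x) for f in active]
    n = len(outputs)
    return [sum(vals) / n for vals in zip(*outputs)]

print(forward("summarization", [2.0, 4.0]))  # uses s_general and s_nlg only
```

Adapting to a new task in this scheme amounts to adding an entry to the mapping (and, if needed, a new skill block), which is the intuition behind precisely reusing relevant skills for new tasks.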