Towards Better Graph Representation: Two-Branch Collaborative Graph Neural Networks for Multimodal Marketing Intention Detection
Lu Zhang, Jian Zhang, Zhibin Li, JingSong Xu
As spreading and collecting information through the Internet has become the norm, more and more people choose to post for-profit content (images and texts) on social networks. Because such content is difficult to censor, malicious marketing can harm society, so it is meaningful to detect marketing intentions online automatically. However, the gap between modalities makes it difficult to fuse images and texts for content marketing detection. To this end, this paper proposes Two-Branch Collaborative Graph Neural Networks, which collaboratively represent multimodal data with Graph Convolutional Networks (GCNs) in an end-to-end fashion. We first embed groups of images and texts separately with GCN layers from two views, and then adopt the proposed multimodal fusion strategy to learn the graph representation collaboratively. Experimental results demonstrate that our method achieves superior graph classification performance for marketing intention detection.
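A minimal sketch of the two-branch idea, assuming a standard GCN propagation rule and a simple concatenation-based fusion; the layer sizes, pooling, and fusion strategy here are illustrative assumptions, not the authors' implementation (class names such as `TwoBranchGCN` are hypothetical).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GCNLayer(nn.Module):
    """One graph convolution: H' = ReLU(A_hat @ H @ W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, a_hat, h):
        # a_hat: normalized adjacency with self-loops, shape (N, N)
        # h: node features, shape (N, in_dim)
        return F.relu(self.linear(a_hat @ h))


class TwoBranchGCN(nn.Module):
    """Two GCN branches (image nodes, text nodes) fused into one graph embedding."""
    def __init__(self, img_dim, txt_dim, hidden_dim, num_classes):
        super().__init__()
        self.img_branch = GCNLayer(img_dim, hidden_dim)
        self.txt_branch = GCNLayer(txt_dim, hidden_dim)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, a_img, x_img, a_txt, x_txt):
        # Embed each modality's graph separately, then pool to a graph-level vector.
        h_img = self.img_branch(a_img, x_img).mean(dim=0)
        h_txt = self.txt_branch(a_txt, x_txt).mean(dim=0)
        # Fusion by concatenation (an assumed placeholder for the paper's strategy).
        fused = torch.cat([h_img, h_txt], dim=-1)
        return self.classifier(fused)
```

In this reading, each branch learns a modality-specific graph representation end-to-end, and the fusion step is where the two views are combined for the final graph classification; the paper's actual fusion strategy may differ from plain concatenation.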