MSNet: A Deep Architecture using Multi-Sentiment Semantics for Sentiment-Aware Image Style Transfer
Shikun Sun (Tsinghua University); Jia Jia (Tsinghua University); Haozhe Wu (Tsinghua University); Zijie Ye (Tsinghua University); Junliang Xing (Tsinghua University)
Sentiment plays an essential role in people’s perception of images. To incorporate sentiment information into image style transfer, we introduce a new task named sentiment-aware image style transfer. To solve this problem, we first introduce a novel Multi-Sentiment Semantics Space (MSS-Space) to capture the non-deterministic and complex nature of sentiment semantics. With the MSS-Space, we establish tight associations between the visual attributes of images and multi-sentiment semantics by minimizing their distance in the MSS-Space, and then propose the Multi-Sentiment Style Transfer Net (MSNet). Experiments demonstrate that, compared with three competing models, the proposed MSNet generates clearer images and better preserves the integrity of salient objects, local details, and multi-sentiment semantics. In particular, our model outperforms the state of the art by +28.72% in top-3 accuracy on average.
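To make the alignment idea concrete, the following is a minimal sketch (not the authors' code) of associating image features with multi-sentiment semantics by projecting both into a shared embedding space and minimizing their distance, as the abstract describes for the MSS-Space. All module names, dimensions, the number of sentiment classes, and the choice of a squared L2 distance are illustrative assumptions.

```python
# Minimal sketch, assuming precomputed image features and soft
# multi-sentiment labels; the actual MSNet/MSS-Space design may differ.
import torch
import torch.nn as nn

class ImageEncoder(nn.Module):
    """Projects image features into the shared sentiment-semantics space."""
    def __init__(self, feat_dim=2048, embed_dim=256):
        super().__init__()
        self.proj = nn.Sequential(nn.Linear(feat_dim, 512), nn.ReLU(),
                                  nn.Linear(512, embed_dim))

    def forward(self, x):
        return self.proj(x)

class SentimentEncoder(nn.Module):
    """Embeds a multi-sentiment distribution (here, over 8 hypothetical classes)."""
    def __init__(self, num_sentiments=8, embed_dim=256):
        super().__init__()
        self.proj = nn.Linear(num_sentiments, embed_dim)

    def forward(self, s):
        return self.proj(s)

img_enc, sent_enc = ImageEncoder(), SentimentEncoder()
optimizer = torch.optim.Adam(
    list(img_enc.parameters()) + list(sent_enc.parameters()), lr=1e-4)

# Dummy batch: image features and soft multi-sentiment label distributions.
img_feats = torch.randn(16, 2048)
sentiments = torch.softmax(torch.randn(16, 8), dim=-1)

for _ in range(10):
    z_img = img_enc(img_feats)
    z_sent = sent_enc(sentiments)
    # Squared L2 distance in the shared space ties visual attributes
    # to multi-sentiment semantics.
    loss = (z_img - z_sent).pow(2).sum(dim=-1).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```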