Convolutional Neural Network Based In-Loop Filter for VVC Intra Coding
Yue Li, Li Zhang, Kai Zhang
In the emerging Versatile Video Coding (VVC) standard, there are three in-loop filters, known as deblocking, sample adaptive offset (SAO), and adaptive loop filter (ALF), for suppressing compression artifacts and reducing distortion. However, these handcrafted filters are insufficient to deal with the complicated compression artifacts. Deep learning-based filtering has demonstrated overwhelming success in the field of image restoration. In this paper, we propose a convolutional neural network-based filter for enhancing the quality of VVC intra-coded frames. The proposed filter takes auxiliary information, including partitioning and prediction information, as input. For chroma, the auxiliary information further includes luma samples. Regarding training, we find that data augmentation and loss function selection are not trivial. We conduct extensive experiments to evaluate the effectiveness of each design. When tested on top of VTM-10.0 under the all-intra configuration, the proposed filter achieves state-of-the-art performance, i.e., on average 7.57%, 13.18%, and 12.50% BD-rate reductions for Y, Cb, and Cr, respectively. The proposed filter ranked first among neural network-based in-loop filters at the 20th meeting of the Joint Video Experts Team (JVET).
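To make the input design concrete, the following is a minimal sketch of a residual CNN filter that concatenates the reconstructed samples with auxiliary planes (a partitioning map and the intra prediction signal), in the spirit described above. The class name CNNLoopFilter, the layer counts, and the channel widths are illustrative assumptions, not the network actually integrated into VTM-10.0.

```python
# Minimal sketch (assumptions labeled): a residual CNN that filters a
# reconstructed frame using auxiliary input planes, as a rough analogue of the
# filter described in the abstract. It is NOT the authors' actual network.
import torch
import torch.nn as nn


class CNNLoopFilter(nn.Module):
    def __init__(self, aux_channels: int = 2, features: int = 64, blocks: int = 8):
        super().__init__()
        # Input: reconstructed samples (1 channel) + auxiliary planes.
        self.head = nn.Conv2d(1 + aux_channels, features, kernel_size=3, padding=1)
        body = []
        for _ in range(blocks):
            body += [nn.Conv2d(features, features, 3, padding=1), nn.ReLU(inplace=True)]
        self.body = nn.Sequential(*body)
        # Predict a residual that is added back onto the reconstruction.
        self.tail = nn.Conv2d(features, 1, kernel_size=3, padding=1)

    def forward(self, recon: torch.Tensor, aux: torch.Tensor) -> torch.Tensor:
        x = torch.cat([recon, aux], dim=1)
        residual = self.tail(self.body(torch.relu(self.head(x))))
        return recon + residual


if __name__ == "__main__":
    # Hypothetical inputs, normalized to [0, 1].
    recon = torch.rand(1, 1, 128, 128)       # reconstructed intra-coded frame
    partition = torch.rand(1, 1, 128, 128)   # partitioning map (e.g. block-boundary mask)
    prediction = torch.rand(1, 1, 128, 128)  # intra prediction signal
    model = CNNLoopFilter(aux_channels=2)
    filtered = model(recon, torch.cat([partition, prediction], dim=1))
    print(filtered.shape)  # torch.Size([1, 1, 128, 128])
```

For a chroma variant, the corresponding luma samples would simply be appended as an additional auxiliary plane (aux_channels increased by one), mirroring the cross-component input described in the abstract.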