A Viewport-driven Multi-metric Fusion Approach for 360-Degree Video Quality Assessment
Roberto GA Azevedo, Neil Birkbeck, Ivan Janatra, Balu Adsumilli, Pascal Frossard
SPS
We propose a new viewport-based multi-metric fusion (MMF) approach for visual quality assessment of 360-degree (omnidirectional) videos. Our method computes multiple spatio-temporal objective quality metrics (features) on viewports extracted from 360-degree videos and learns a model that combines these features into a single metric that closely matches subjective quality scores. The proposed method is motivated by two observations: 1) quality metrics computed on viewports better capture the user experience than metrics computed on the projection domain; 2) no individual objective image quality metric performs best for all types of visual distortions, whereas a learned combination of them can adapt to different conditions and produce better results overall. Experimental results, based on the largest available 360-degree video quality dataset, demonstrate that the proposed metric outperforms state-of-the-art 360-degree and 2D video quality metrics.
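The fusion step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes that per-viewport objective metrics have already been computed and averaged into a feature vector per video, and it uses a simple ridge-regression fusion trained against subjective scores (MOS). The feature values and scores below are synthetic placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data: each row holds viewport-averaged objective
# metrics (e.g., normalized PSNR, SSIM, a temporal feature) for one
# distorted 360-degree video. Real features would come from viewports
# rendered out of the sphere, not from the projection domain.
n_videos, n_features = 50, 3
X = rng.uniform(0.0, 1.0, size=(n_videos, n_features))

# Synthetic subjective scores generated from a hidden linear combination,
# standing in for MOS collected in a subjective study.
true_w = np.array([0.5, 1.5, -0.3])
y = X @ true_w + 3.0 + rng.normal(0.0, 0.01, n_videos)

# Learn fusion weights via regularized least squares (ridge regression),
# one possible choice of learned combiner.
lam = 1e-3
A = np.hstack([X, np.ones((n_videos, 1))])  # append a bias column
w = np.linalg.solve(A.T @ A + lam * np.eye(n_features + 1), A.T @ y)

def fused_score(features: np.ndarray) -> float:
    """Predict a quality score from a vector of per-viewport metrics."""
    return float(np.append(features, 1.0) @ w)

print(fused_score(np.array([0.5, 0.5, 0.5])))
```

In practice the learned combiner could be any regressor (e.g., an SVR or a shallow network); the key idea is that the fusion adapts the relative weight of each objective metric to the distortion types seen in training.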