
No-Reference Video Quality Assessment Using Space-Time Chips

Joshua Peter Ebenezer, Zaixi Shang, Yongjun Wu, Hai Wei, Alan Conrad Bovik

Length: 10:43
21 Sep 2020

We propose a new model for no-reference video quality assessment (VQA) based on the natural statistics of space-time chips of videos. Space-time chips (ST-chips) are a new, quality-aware feature space which we define as space-time localized cuts of video data in directions that are determined by the local motion flow. We use parametrized statistical fits to the statistics of space-time chips to characterize quality, and show that the parameters from these models are affected by distortion and can hence be used to objectively predict the quality of videos. The proposed method, which we tentatively call ChipQA, is agnostic to the types of distortion affecting the video, and is based on identifying and quantifying deviations from the expected statistics of natural, undistorted ST-chips in order to predict video quality. We train and test our resulting model on several large VQA databases and show that our model achieves high correlation against human judgments of video quality and is competitive with state-of-the-art models.
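The abstract describes the core pipeline: extract space-time localized cuts (ST-chips) from the video, compute normalized bandpass statistics over them, and fit a parametric model whose parameters shift under distortion. The sketch below illustrates that premise under simplifying assumptions not taken from the paper: a fixed-direction column-time slice stands in for the motion-flow-oriented cuts, a box window stands in for the usual Gaussian weighting, and a moment-matched generalized Gaussian (GGD) fit, as used in earlier NSS-based models, stands in for the paper's parametrized fits.

```python
import numpy as np
from scipy.ndimage import uniform_filter
from scipy.special import gamma as gamma_fn

def mscn(plane, C=1.0, k=3):
    """Mean-subtracted, contrast-normalized (MSCN) coefficients.

    A box window stands in for the Gaussian weighting typically used in
    NSS-based models; this keeps the sketch dependency-light."""
    mu = uniform_filter(plane, size=k)
    var = uniform_filter(plane * plane, size=k) - mu * mu
    sigma = np.sqrt(np.maximum(var, 0.0))
    return (plane - mu) / (sigma + C)

def fit_ggd(x):
    """Moment-matching fit of a zero-mean generalized Gaussian.

    Returns (shape alpha, variance). Distortions shift these parameters
    away from their values for pristine content, which is the premise of
    NSS-based quality prediction."""
    x = np.asarray(x, dtype=np.float64).ravel()
    sigma_sq = np.mean(x * x)
    rho = sigma_sq / (np.mean(np.abs(x)) ** 2 + 1e-12)
    shapes = np.arange(0.2, 10.0, 0.001)
    ratios = gamma_fn(1.0 / shapes) * gamma_fn(3.0 / shapes) / gamma_fn(2.0 / shapes) ** 2
    alpha = shapes[np.argmin((ratios - rho) ** 2)]
    return alpha, sigma_sq

# Toy video: T frames of H x W noise (a stand-in for real luma data).
rng = np.random.default_rng(0)
T, H, W = 16, 64, 64
video = rng.standard_normal((T, H, W))

# A space-time cut: here a fixed column-time slice of size T x T.
# ChipQA orients these cuts along the local motion flow; a fixed
# direction is used here purely for illustration.
row, col = 20, 20
chip = video[:, row:row + T, col]          # shape (T, T)

alpha, sigma_sq = fit_ggd(mscn(chip))
print(f"GGD shape={alpha:.2f}, variance={sigma_sq:.2f}")
```

In a full model, such fits would be computed over many chips per frame and the resulting parameters pooled into a feature vector for a regressor trained against human opinion scores.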
