19 Oct 2022

Temporal color constancy (TCC) is a recent research direction that uses multiple temporally correlated images (e.g., frames in a video) to perform illuminant estimation. Compared to traditional single-frame color constancy methods, temporal (multi-frame) methods can leverage the additional temporal information inherent in image sequences, which makes them naturally suited to processing videos. In this paper, we present a hybrid framework called Fusion Temporal Color Constancy that combines classical color constancy algorithms, convolutional neural networks (CNNs), and recurrent neural networks (RNNs) for the TCC task. Experimental results show that our hybrid approach achieves state-of-the-art accuracy on several publicly available multi-frame and single-frame color constancy benchmarks.
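To make the hybrid idea concrete, the sketch below shows one way such a pipeline could be assembled: a classical per-frame estimator feeding, together with CNN features, into a recurrent network that outputs a single illuminant for the clip. This is a minimal illustrative sketch, not the published FTCC architecture; the choice of Gray-World as the classical estimator, the GRU-based fusion, the layer sizes, and the angular-error metric are assumptions drawn from common practice in color constancy.

```python
# Hypothetical sketch of a hybrid temporal color constancy model.
# Gray-World, the GRU fusion, and all layer sizes are illustrative
# assumptions, not the published FTCC design.
import torch
import torch.nn as nn
import torch.nn.functional as F


def gray_world(frames: torch.Tensor) -> torch.Tensor:
    """Classical Gray-World estimate per frame.

    frames: (B, T, 3, H, W) -> unit-norm illuminants (B, T, 3).
    Gray-World assumes the average scene reflectance is achromatic,
    so the mean RGB of each frame is taken as the illuminant color.
    """
    est = frames.mean(dim=(-2, -1))             # (B, T, 3)
    return F.normalize(est, dim=-1)


class FrameEncoder(nn.Module):
    """Small CNN that maps one frame to a feature vector."""

    def __init__(self, feat_dim: int = 64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, feat_dim, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.conv(x).flatten(1)           # (N, feat_dim)


class HybridTCC(nn.Module):
    """CNN features + classical estimates fused by a recurrent network."""

    def __init__(self, feat_dim: int = 64, hidden: int = 128):
        super().__init__()
        self.encoder = FrameEncoder(feat_dim)
        # Each time step sees CNN features plus the Gray-World estimate.
        self.rnn = nn.GRU(feat_dim + 3, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 3)

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        b, t = frames.shape[:2]
        feats = self.encoder(frames.flatten(0, 1)).view(b, t, -1)
        classical = gray_world(frames)
        seq = torch.cat([feats, classical], dim=-1)
        _, h = self.rnn(seq)                     # final hidden state
        return F.normalize(self.head(h[-1]), dim=-1)  # unit-norm illuminant


def angular_error(pred: torch.Tensor, gt: torch.Tensor) -> torch.Tensor:
    """Standard recovery angular error (degrees) between illuminants."""
    cos = (F.normalize(pred, dim=-1) * F.normalize(gt, dim=-1)).sum(-1)
    return torch.rad2deg(torch.acos(cos.clamp(-1.0, 1.0)))


if __name__ == "__main__":
    video = torch.rand(2, 8, 3, 64, 64)          # 2 clips, 8 frames each
    model = HybridTCC()
    pred = model(video)
    print(pred.shape, angular_error(pred, torch.rand(2, 3)).mean().item())
```

The angular error shown here is the metric conventionally reported on color constancy benchmarks; how the classical, CNN, and RNN components are actually weighted and trained in FTCC is described in the paper itself.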
