Event-based High Frame-rate Video Reconstruction with a Novel Cycle-Event Network
Binyi Su, Lei Yu, Wen Yang
SPS
Event-to-image translation is a popular problem whose goal is to learn a mapping from an input event stream to an output intensity image using a set of aligned image pairs for training. However, due to the high temporal resolution of event cameras, ground truth aligned to the events is difficult to acquire. In this paper, we first propose an enhanced Cycle-Consistency Generative Adversarial Network (enhanced Cycle-GAN), called the Cycle-Event Network, which requires no paired data during training. In addition, noise from event cameras can severely degrade data quality and make the reconstruction an ill-posed problem. To generate high frame-rate video from events with a less noisy background and richer texture details, a novel attention mechanism (Residual Channel-wise Attention Gate) is then proposed to reweight the features of the generator in the Cycle-Event Network. Qualitative results are presented on several datasets, and the quantitative comparisons clearly demonstrate the effectiveness of the proposed Cycle-Event Network.
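The abstract does not give the exact formulation of the Residual Channel-wise Attention Gate; a minimal numpy sketch of one plausible reading, in the spirit of squeeze-and-excitation attention with a residual connection, is shown below. The function name and the weight matrices `w1`, `w2` are hypothetical, not taken from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def residual_channel_attention(feat, w1, w2):
    """Hypothetical residual channel-wise attention gate.

    feat: feature map of shape (channels, height, width)
    w1:   bottleneck weights, shape (channels // r, channels)
    w2:   expansion weights, shape (channels, channels // r)
    """
    # Squeeze: global average pooling over spatial dims -> one value per channel
    s = feat.mean(axis=(1, 2))
    # Excitation: bottleneck MLP (ReLU) followed by a sigmoid gate in (0, 1)
    gate = sigmoid(w2 @ np.maximum(w1 @ s, 0.0))
    # Reweight each channel by its gate, then add the residual (identity) path
    return feat * gate[:, None, None] + feat
```

Because the gate lies in (0, 1) and the identity path is added back, each channel is scaled by a factor between 1 and 2, so informative channels are emphasized without suppressing the original features entirely.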