StyleGAN-induced data-driven regularization for inverse problems
Arthur Conmy, Subhadip Mukherjee, Carola-Bibiane Schönlieb
Recent advances in generative adversarial networks (GANs) have made it possible to generate high-resolution, photo-realistic images that were previously out of reach. The ability of GANs to sample from high-dimensional distributions has naturally motivated researchers to leverage their power for modeling the image prior in inverse problems. We extend this line of research by developing a Bayesian image reconstruction framework that exploits the full potential of a pre-trained StyleGAN2 generator, currently the dominant GAN architecture, to construct the prior distribution on the underlying image. Our proposed approach, which we refer to as learned Bayesian reconstruction with generative models (L-BRGM), entails joint optimization over the style-code and the input latent code, and enhances the expressive power of a pre-trained StyleGAN2 generator by allowing the style-codes to differ across generator layers. On the inverse problems of image inpainting and super-resolution, we demonstrate that the proposed approach is competitive with, and sometimes superior to, state-of-the-art GAN-based image reconstruction methods.
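
To make the joint optimization described above concrete, the following is a minimal PyTorch sketch of reconstructing an image by optimizing an input latent code z together with per-layer style-codes w through a frozen generator. Everything here is an illustrative assumption rather than the paper's implementation: the tiny mapping/synthesis networks stand in for a pre-trained StyleGAN2, the inpainting mask and measurements are placeholder data, and the quadratic coupling penalty with weight lambda_w is a simplified stand-in for the paper's actual Bayesian prior terms.

import torch

NUM_LAYERS, Z_DIM, W_DIM = 14, 512, 512  # typical StyleGAN2 sizes (assumed)

# Stand-ins for a pre-trained StyleGAN2 mapping network and synthesis network.
mapping = torch.nn.Sequential(torch.nn.Linear(Z_DIM, W_DIM), torch.nn.LeakyReLU(0.2))
synthesis = torch.nn.Linear(NUM_LAYERS * W_DIM, 3 * 64 * 64)

for p in list(mapping.parameters()) + list(synthesis.parameters()):
    p.requires_grad_(False)  # the generator stays fixed; only the codes are optimized

def G(w_per_layer):
    # Map per-layer style-codes (NUM_LAYERS, W_DIM) to an image (stand-in generator).
    return synthesis(w_per_layer.flatten()).view(3, 64, 64)

mask = (torch.rand(3, 64, 64) > 0.5).float()  # placeholder inpainting mask

def A(x):
    # Forward operator of the inverse problem; here, masking for inpainting.
    return x * mask

y = A(torch.rand(3, 64, 64))  # placeholder measurements

# Initialize one latent code z and per-layer style-codes w (extended W+ space),
# starting w from the mapping of a random latent.
z = torch.randn(Z_DIM, requires_grad=True)
w = mapping(torch.randn(Z_DIM)).detach().repeat(NUM_LAYERS, 1).requires_grad_(True)

opt = torch.optim.Adam([z, w], lr=1e-2)
lambda_w = 0.1  # weight coupling w to mapping(z); value is an assumption

for step in range(500):
    opt.zero_grad()
    data_fit = ((A(G(w)) - y) ** 2).sum()     # measurement consistency
    coupling = ((w - mapping(z)) ** 2).sum()  # keep style-codes near mapping(z)
    loss = data_fit + lambda_w * coupling
    loss.backward()
    opt.step()

x_hat = G(w).detach()  # reconstructed image

Letting each of the NUM_LAYERS rows of w drift independently of mapping(z) is what gives the extended style space its added expressive power; the coupling term keeps the codes near the range of the pre-trained mapping network so the prior remains informative.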