DGAN: Disentangled Representation Learning for Anisotropic BRDF Reconstruction
Zhongyun Hu, Qing Wang, Xue Wang
Accurate reconstruction of real-world material appearance from a very limited number of samples remains a significant challenge in computer vision and graphics. In this paper, we present a novel deep architecture, the Disentangled Generative Adversarial Network (DGAN), which performs anisotropic Bidirectional Reflectance Distribution Function (BRDF) reconstruction from the single BRDF subspace with the maximum entropy. In contrast to previous approaches that directly map known samples to a full BRDF using a CNN, disentangled representation learning is applied to guide the reconstruction process. In order to learn different physical factors of the BRDF, the generator of the DGAN consists mainly of a Fresnel estimator module (FEM) and a directional module (DM). Considering that the entropy of different BRDF subspaces varies, we further divide the BRDF into He-BRDF and Le-BRDF to reconstruct the interior part and the exterior part of the directional factor, respectively. Experimental results show that our approach outperforms state-of-the-art methods.
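To make the disentangled generator concrete, the sketch below shows one plausible PyTorch realization of the structure the abstract describes: a generator split into a Fresnel estimator module (FEM) and a directional module (DM) whose outputs are recombined into the full BRDF. All layer sizes, activations, and the multiplicative recombination rule are assumptions for illustration only; the paper's actual architecture (and its adversarial training setup) may differ.

```python
# Minimal sketch of a disentangled BRDF generator, assuming the generator
# factors the BRDF into a Fresnel factor (FEM) and a directional factor (DM).
# All dimensions, layers, and the elementwise-product recombination are
# hypothetical; they illustrate the idea, not the paper's exact network.
import torch
import torch.nn as nn


class FresnelEstimatorModule(nn.Module):
    """Estimates a per-sample Fresnel factor from the known BRDF samples."""

    def __init__(self, in_dim: int, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Sigmoid(),  # scalar Fresnel weight in [0, 1]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


class DirectionalModule(nn.Module):
    """Predicts the directional factor of the full BRDF."""

    def __init__(self, in_dim: int, out_dim: int, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim), nn.Softplus(),  # non-negative reflectance
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


class DisentangledGenerator(nn.Module):
    """Recombines the two disentangled factors into a reconstructed BRDF."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.fem = FresnelEstimatorModule(in_dim)
        self.dm = DirectionalModule(in_dim, out_dim)

    def forward(self, samples: torch.Tensor) -> torch.Tensor:
        fresnel = self.fem(samples)       # (B, 1)
        directional = self.dm(samples)    # (B, out_dim)
        return fresnel * directional      # assumed multiplicative recombination


# Usage: reconstruct a flattened BRDF from a subspace of known samples.
g = DisentangledGenerator(in_dim=64, out_dim=1024)
known_samples = torch.randn(8, 64)       # batch of 8 sampled subspaces
full_brdf = g(known_samples)             # (8, 1024) reconstructed BRDF
```

In a full GAN setup, a discriminator would additionally score `full_brdf` against measured BRDFs; that component is omitted here for brevity.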