04 May 2020

Accurate reconstruction of real-world material appearance from a very limited number of samples remains a major challenge in computer vision and graphics. In this paper, we present a novel deep architecture, the Disentangled Generative Adversarial Network (DGAN), which performs anisotropic Bidirectional Reflectance Distribution Function (BRDF) reconstruction from the single BRDF subspace with maximum entropy. In contrast to previous approaches that directly map known samples to a full BRDF with a CNN, disentangled representation learning is applied to guide the reconstruction process. To learn the different physical factors of the BRDF, the generator of the DGAN consists mainly of a Fresnel estimator module (FEM) and a directional module (DM). Because the entropy of different BRDF subspaces varies, we further divide the BRDF into a He-BRDF and a Le-BRDF to reconstruct the interior part and the exterior part of the directional factor, respectively. Experimental results show that our approach outperforms state-of-the-art methods.
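The abstract does not specify how the Fresnel factor is modeled inside the FEM, but in BRDF work this factor is commonly approximated with Schlick's formula, F(θ) = F0 + (1 − F0)(1 − cos θ)^5, where F0 is the reflectance at normal incidence. A minimal sketch of that standard approximation (not the paper's actual estimator):

```python
import math


def fresnel_schlick(cos_theta: float, f0: float) -> float:
    """Schlick's approximation to Fresnel reflectance.

    cos_theta: cosine of the angle between the half vector and the
               incident direction (clamped to [0, 1] by the caller).
    f0:        reflectance at normal incidence (e.g. ~0.04 for
               typical dielectrics).
    """
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5


# At normal incidence (cos_theta = 1) the formula reduces to f0:
print(fresnel_schlick(1.0, 0.04))  # → 0.04
# At grazing incidence (cos_theta → 0) reflectance approaches 1:
print(fresnel_schlick(0.0, 0.04))
```

A learned estimator such as the FEM would predict parameters like F0 from observed samples rather than evaluate a fixed formula; the sketch only illustrates the physical factor being disentangled.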
