Single-Shot Domain Adaptation via Target-Aware Generative Augmentations
Rakshith Subramanyam (Arizona State University); Kowshik Thopalli (Arizona State University); Spring Berman (Arizona State University, USA); Pavan Turaga (Arizona State University); Jayaraman J. Thiagarajan (Lawrence Livermore National Laboratory)
The problem of adapting models from a source domain using data from any target domain of interest has gained prominence, thanks to the brittle generalization of deep neural networks. While several test-time adaptation techniques have emerged, they typically rely on synthetic data augmentations in cases of limited target data availability. In this paper, we consider the challenging setting of single-shot adaptation and explore the design of augmentation strategies. We argue that the augmentations utilized by existing methods are insufficient to handle large distribution shifts, and hence propose a new approach, SiSTA (Single-Shot Target Augmentations), which first fine-tunes a generative model from the source domain using a single-shot target, and then employs novel sampling strategies for curating synthetic target data. Using experiments with a state-of-the-art domain adaptation method, we find that SiSTA produces improvements as high as 20% over existing baselines under challenging shifts in face attribute detection, and that it performs competitively with oracle models obtained by training on a larger target dataset.
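To make the two-stage pipeline in the abstract concrete, the sketch below illustrates the overall flow: fine-tune a copy of a source-domain generative model on a single target-domain example, then sample from the adapted generator to curate synthetic target data for a downstream adaptation method. This is a minimal illustration under stated assumptions, not the authors' implementation; all names here (SourceGenerator, finetune_on_single_target, curate_synthetic_target_data) are hypothetical placeholders, the generator is a toy MLP standing in for a StyleGAN-style model, and the truncated latent sampling is only one simple curation strategy rather than the paper's specific sampling schemes.

```python
# Hypothetical sketch of the single-shot target-augmentation pipeline; not the authors' code.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F


class SourceGenerator(nn.Module):
    """Stand-in for a generative model pre-trained on the source domain
    (e.g., a StyleGAN-style generator); a small MLP here for illustration."""

    def __init__(self, latent_dim: int = 64, out_dim: int = 3 * 32 * 32):
        super().__init__()
        self.latent_dim = latent_dim
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(),
            nn.Linear(256, out_dim), nn.Tanh(),
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        return self.net(z)


def finetune_on_single_target(source_gen: SourceGenerator,
                              target_image: torch.Tensor,
                              steps: int = 200,
                              lr: float = 1e-3) -> SourceGenerator:
    """Fine-tune a copy of the source generator toward the single target-domain
    example (a simplified stand-in for single-shot generator fine-tuning)."""
    gen = copy.deepcopy(source_gen)
    opt = torch.optim.Adam(gen.parameters(), lr=lr)
    z = torch.randn(1, gen.latent_dim)  # fixed latent anchored to the target shot
    for _ in range(steps):
        opt.zero_grad()
        loss = F.mse_loss(gen(z), target_image)  # reconstruction-style objective
        loss.backward()
        opt.step()
    return gen


def curate_synthetic_target_data(gen: SourceGenerator,
                                 num_samples: int = 512,
                                 truncation: float = 0.7) -> torch.Tensor:
    """Sample synthetic target-like data from the fine-tuned generator using
    simple truncated latent sampling as an illustrative curation strategy."""
    with torch.no_grad():
        z = truncation * torch.randn(num_samples, gen.latent_dim)
        return gen(z)


if __name__ == "__main__":
    source_gen = SourceGenerator()
    target_shot = torch.rand(1, 3 * 32 * 32) * 2 - 1  # the single target example
    target_gen = finetune_on_single_target(source_gen, target_shot)
    synthetic_target = curate_synthetic_target_data(target_gen)
    print(synthetic_target.shape)  # torch.Size([512, 3072])
```

In this reading, the curated synthetic samples would then replace (or augment) real target data when running a source-free or test-time domain adaptation method, which is the role the abstract assigns to the state-of-the-art adaptation method used in the experiments.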