Unsupervised domain adaptation (UDA) typically transfers knowledge from a label-rich source domain to an unlabeled target domain via adversarial learning. Existing UDA approaches mainly focus on global distribution alignment between domains while ignoring intrinsic local distribution properties. Motivated by this observation, we propose an end-to-end structure-conditioned adversarial learning scheme (SCAL) that preserves intra-class compactness during domain distribution alignment. Using local structures as structure-aware conditions, the proposed scheme is implemented in a structure-conditioned adversarial learning pipeline. The learning procedure iterates by alternating between local structure establishment and structure-conditioned adversarial learning. Experimental results demonstrate the effectiveness of the proposed scheme in UDA scenarios.
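The alternation described above (establish local structures, then use them as conditions for adversarial alignment) can be sketched as follows. This is a minimal illustrative skeleton, not the paper's implementation: the feature dimensions, the number of clusters `K`, the use of plain k-means for local-structure establishment, and the one-hot conditioning scheme are all assumptions, and the actual adversarial discriminator update is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for extracted features: 100 source and 100 target
# samples, 8-dimensional (hypothetical sizes, not from the paper).
Xs = rng.normal(0.0, 1.0, (100, 8))
Xt = rng.normal(0.5, 1.0, (100, 8))
K = 4  # assumed number of local structures (clusters) per domain

def kmeans(X, k, iters=20):
    """Plain k-means: one possible way to establish local structures."""
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = dists.argmin(1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(0)
    return labels

def condition(X, labels, k):
    """Structure-aware conditioning: append a one-hot cluster code to
    each feature before it would enter the domain discriminator."""
    onehot = np.eye(k)[labels]
    return np.hstack([X, onehot])

# Alternate between (1) local-structure establishment and
# (2) building structure-conditioned inputs for the adversarial step.
for _ in range(3):
    ls, lt = kmeans(Xs, K), kmeans(Xt, K)                 # step (1)
    Ds, Dt = condition(Xs, ls, K), condition(Xt, lt, K)   # step (2) inputs
    # ... a domain-discriminator / feature-extractor adversarial
    # update on (Ds, Dt) would go here; omitted in this sketch.

print(Ds.shape, Dt.shape)
```

Each conditioned sample carries its original features plus its structure code, so a discriminator trained on `Ds` versus `Dt` sees domain and local-structure information jointly, which is the intuition behind preserving intra-class compactness during alignment.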