16 Feb 2017 | Mingsheng Long†, Han Zhu†, Jianmin Wang†, and Michael I. Jordan‡
The paper introduces a novel approach to unsupervised domain adaptation in deep neural networks that jointly learns adaptive classifiers and transferable features. Unlike previous methods, which assume the source and target domains share a single classifier, this work relaxes that assumption by positing a residual function connecting the two classifiers. The proposed Residual Transfer Network (RTN) leverages deep residual learning to learn this residual function explicitly, so the target classifier can better adapt to the target domain. The method also fuses features from multiple layers via tensor products and embeds the fused features into reproducing kernel Hilbert spaces to match the source and target distributions, facilitating feature adaptation. The RTN can be trained efficiently by standard back-propagation and has been shown to outperform state-of-the-art methods on standard domain adaptation benchmarks. Extensive experiments on the Office-31 and Office-Caltech datasets demonstrate the effectiveness of the proposed approach on both hard and easy transfer tasks.
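To make the two ideas concrete, the sketch below illustrates (1) a classifier with a learned residual, so the source classifier is the target classifier plus a perturbation, and (2) an MMD-style loss on the tensor product of two layers' outputs as a stand-in for the paper's multi-layer fusion. This is a minimal illustration, not the authors' implementation: module names, layer sizes, and the single fixed-bandwidth Gaussian kernel are assumptions made here for brevity.

```python
# Hypothetical PyTorch sketch of the RTN ideas; names and hyperparameters are illustrative.
import torch
import torch.nn as nn


class ResidualClassifier(nn.Module):
    """Target classifier f_T plus a learned residual, so f_S(x) = f_T(x) + delta_f(f_T(x))."""

    def __init__(self, feat_dim: int, num_classes: int):
        super().__init__()
        self.f_target = nn.Linear(feat_dim, num_classes)  # f_T: used at test time on target data
        self.residual = nn.Sequential(                    # delta_f: small residual block
            nn.Linear(num_classes, num_classes),
            nn.ReLU(),
            nn.Linear(num_classes, num_classes),
        )

    def forward(self, features: torch.Tensor):
        f_t = self.f_target(features)
        f_s = f_t + self.residual(f_t)                    # source classifier = target classifier + residual
        return f_s, f_t


def gaussian_kernel(x: torch.Tensor, y: torch.Tensor, sigma: float = 1.0) -> torch.Tensor:
    """RBF kernel between two batches of fused representations."""
    sq_dist = torch.cdist(x, y) ** 2
    return torch.exp(-sq_dist / (2 * sigma ** 2))


def tensor_mmd(src_feats, src_logits, tgt_feats, tgt_logits) -> torch.Tensor:
    """MMD on the tensor (outer) product of a feature layer and the classifier outputs.
    A single Gaussian kernel is used here in place of the multi-kernel MMD."""
    src = torch.bmm(src_feats.unsqueeze(2), src_logits.unsqueeze(1)).flatten(1)
    tgt = torch.bmm(tgt_feats.unsqueeze(2), tgt_logits.unsqueeze(1)).flatten(1)
    return (gaussian_kernel(src, src).mean()
            + gaussian_kernel(tgt, tgt).mean()
            - 2 * gaussian_kernel(src, tgt).mean())
```

In training, the classification loss would be computed on the source outputs `f_s` (where labels exist), while `tensor_mmd` is added as a regularizer over source and target batches; both terms are differentiable, which is why the whole network can be optimized with standard back-propagation.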