Domain-Agnostic Mutual Prompting for Unsupervised Domain Adaptation

5 Mar 2024 | Zhekai Du, Xinyao Li, Fengling Li, Ke Lu, Lei Zhu, Jingjing Li
This paper proposes Domain-Agnostic Mutual Prompting (DAMP), a novel framework for Unsupervised Domain Adaptation (UDA) that leverages large-scale pre-trained Vision-Language Models (VLMs) to learn domain-agnostic prompts. Traditional UDA methods focus on minimizing the distribution discrepancy between domains, but they often neglect rich semantics and struggle with complex domain shifts. DAMP addresses these limitations by mutually aligning visual and textual embeddings through cross-attention mechanisms, enabling domain-invariant representations. The framework learns both textual and visual prompts in a mutual learning setup, with regularizations that encourage the prompts to carry domain-agnostic and instance-conditioned information. Evaluated on three widely used UDA benchmarks, Office-Home, VisDA-17, and Mini-DomainNet, DAMP outperforms state-of-the-art methods in classification accuracy, showing consistent improvements over existing approaches. The proposed framework provides an effective way to harness both source-domain and pre-trained VLM knowledge for UDA.
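To make the core idea concrete, below is a minimal PyTorch sketch of cross-attention-based mutual prompting as described in the abstract: learnable prompt tokens for one modality attend to embeddings from the other modality, yielding instance-conditioned prompts. This is an illustrative sketch under assumed settings, not the authors' implementation; the class name `CrossAttentionPrompter`, the embedding dimension (512), and the number of prompt tokens (4) are all assumptions for demonstration.

```python
import torch
import torch.nn as nn

class CrossAttentionPrompter(nn.Module):
    """Produces instance-conditioned prompt tokens by letting learnable,
    domain-agnostic prompt queries attend to embeddings from the other
    modality. Illustrative sketch only; not the paper's exact architecture."""
    def __init__(self, dim: int = 512, num_prompts: int = 4, num_heads: int = 8):
        super().__init__()
        # Learnable prompt queries, shared across domains (domain-agnostic).
        self.prompts = nn.Parameter(torch.randn(num_prompts, dim) * 0.02)
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, context: torch.Tensor) -> torch.Tensor:
        # context: (B, L, dim) token embeddings from the other modality.
        B = context.size(0)
        q = self.prompts.unsqueeze(0).expand(B, -1, -1)  # (B, P, dim)
        # Cross-attention: prompt queries attend over the context tokens,
        # so each output prompt is conditioned on the specific instance.
        out, _ = self.attn(q, context, context)
        return self.norm(q + out)  # (B, P, dim) instance-conditioned prompts

# Mutual prompting: textual prompts conditioned on image tokens, and
# visual prompts conditioned on text tokens.
text_prompter = CrossAttentionPrompter()
visual_prompter = CrossAttentionPrompter()

image_tokens = torch.randn(8, 49, 512)  # e.g. patch embeddings from a VLM image encoder
text_tokens = torch.randn(8, 16, 512)   # e.g. token embeddings from a VLM text encoder

text_prompts = text_prompter(image_tokens)    # injected into the text branch
visual_prompts = visual_prompter(text_tokens) # injected into the vision branch
print(text_prompts.shape, visual_prompts.shape)  # torch.Size([8, 4, 512]) each
```

In a full UDA pipeline, these prompts would be prepended to the respective encoder inputs and trained with the paper's mutual-learning objective and regularizations; that training loop is omitted here.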