AMU-Tuning: Effective Logit Bias for CLIP-based Few-shot Learning

13 Apr 2024 | Yuwei Tang*, Zhenyi Lin*, Qilong Wang†, Pengfei Zhu, Qinghua Hu
This paper proposes AMU-Tuning, a method that improves CLIP-based few-shot learning by effectively learning a logit bias. It first introduces a unified formulation that analyzes existing CLIP-based few-shot methods from the perspective of logit bias, and disassembles the computation of that bias into three key components: logit features, logit predictor, and logit fusion.

Building on this analysis, AMU-Tuning (i) exploits appropriate Auxiliary features to compute the logit bias, (ii) employs a feature-initialized linear classifier with Multi-branch training as the logit predictor, and (iii) develops an Uncertainty-based fusion that adaptively incorporates the logit bias into the zero-shot CLIP logits for few-shot classification.

Experiments on several downstream tasks and out-of-distribution benchmarks show that AMU-Tuning clearly outperforms existing methods and achieves state-of-the-art performance in CLIP-based few-shot learning. The ablation analysis of the logit-bias components highlights the importance of feature initialization and of auxiliary features that are complementary to CLIP, and the method remains robust under out-of-distribution settings, indicating strong generalization. The paper concludes that the logit-bias perspective offers new insight into CLIP-based few-shot learning and should encourage the development of more effective logit-bias learning methods.
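The logit-bias view can be illustrated with a short sketch: the final prediction adds a learned bias, computed from auxiliary features by a linear predictor, to the frozen zero-shot CLIP logits, with a fusion weight driven by the uncertainty of the zero-shot prediction. The PyTorch code below is a minimal sketch under these assumptions; the class and argument names (LogitBiasClassifier, aux_dim, lambda_, etc.) are illustrative and do not come from the authors' implementation.

```python
# Minimal sketch of the logit-bias formulation of CLIP-based few-shot learning.
# Illustrative only: names and the exact fusion rule are assumptions, not the
# authors' released code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LogitBiasClassifier(nn.Module):
    def __init__(self, clip_text_weights, aux_dim, num_classes, lambda_=1.0):
        super().__init__()
        # Frozen zero-shot classifier: cosine similarity between CLIP image
        # features and the class text embeddings (shape: num_classes x clip_dim).
        self.register_buffer("text_weights", F.normalize(clip_text_weights, dim=-1))
        # Logit predictor on auxiliary features (e.g., from a self-supervised
        # backbone); in AMU-Tuning its weights would be initialized from
        # class-mean support features, omitted here for brevity.
        self.bias_head = nn.Linear(aux_dim, num_classes, bias=False)
        self.lambda_ = lambda_

    def forward(self, clip_feat, aux_feat):
        clip_feat = F.normalize(clip_feat, dim=-1)
        zero_shot_logits = 100.0 * clip_feat @ self.text_weights.t()
        bias_logits = self.bias_head(aux_feat)

        # Uncertainty-based fusion (sketch): when the zero-shot prediction is
        # uncertain (high entropy), rely more heavily on the learned bias.
        probs = zero_shot_logits.softmax(dim=-1)
        entropy = -(probs * probs.clamp_min(1e-8).log()).sum(dim=-1, keepdim=True)
        weight = self.lambda_ * entropy  # per-sample fusion weight

        return zero_shot_logits + weight * bias_logits
```

In this sketch only the bias head is trained on the few-shot support set, while the CLIP text weights stay frozen; the paper's feature initialization and multi-branch training of the predictor are summarized in comments rather than implemented.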