A HARD-TO-BEAT BASELINE FOR TRAINING-FREE CLIP-BASED ADAPTATION

2024 | Zhengbo Wang, Jian Liang, Lijun Sheng, Ran He, Zilei Wang, Tieniu Tan
This paper proposes a training-free adaptation method for CLIP based on Gaussian Discriminant Analysis (GDA). The method leverages the assumption that features from different classes follow Gaussian distributions with identical covariance. By applying Bayes' formula, the classifier can be expressed in terms of the class means and covariance, which can be estimated from the data without the need for training. The method is integrated with the original zero-shot classifier within CLIP to combine knowledge from both visual and textual modalities. The approach is validated on 17 datasets, demonstrating superior performance in few-shot classification, imbalanced learning, and out-of-distribution generalization. The method is also extended to base-to-new generalization and unsupervised learning, where it achieves comparable results to state-of-the-art methods. The code is publicly available at https://github.com/mrflogs/ICLR24.
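The GDA construction described above can be sketched in a few lines of NumPy: under the equal-covariance Gaussian assumption, Bayes' rule yields a linear classifier whose weights are w_k = Σ⁻¹μ_k and whose biases depend on μ_kᵀΣ⁻¹μ_k, so both are computed in closed form from the few-shot features with no training loop. The function below is a minimal illustrative sketch, not the paper's exact implementation; the `eps` regularizer and the uniform-prior bias term are assumptions for this example.

```python
import numpy as np

def gda_classifier(features, labels, num_classes, eps=1e-4):
    """Build a linear classifier in closed form via Gaussian Discriminant
    Analysis: class means plus a shared (pooled) covariance, no training.

    features: (N, D) image embeddings; labels: (N,) integer class ids.
    Returns (W, b) such that logits = features @ W.T + b.
    """
    D = features.shape[1]
    # Per-class mean vectors mu_k, shape (K, D).
    mus = np.stack([features[labels == k].mean(axis=0)
                    for k in range(num_classes)])
    # Pooled covariance: subtract each sample's own class mean, then average.
    centered = features - mus[labels]
    cov = centered.T @ centered / len(features)
    # Regularized inverse for numerical stability (eps is an assumption here).
    precision = np.linalg.inv(cov + eps * np.eye(D))
    W = mus @ precision                       # w_k = Sigma^{-1} mu_k
    b = -0.5 * np.einsum('kd,kd->k', W, mus)  # -1/2 mu_k^T Sigma^{-1} mu_k (uniform prior)
    return W, b
```

To combine with CLIP's zero-shot classifier as the abstract describes, one would add the GDA logits to the text-encoder logits with some mixing weight, e.g. `logits = zero_shot_logits + alpha * (features @ W.T + b)`; the name `alpha` and this particular ensembling form are hypothetical stand-ins for whatever balancing the paper actually uses.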