Transfer Learning for Nonparametric Regression: Non-asymptotic Minimax Analysis and Adaptive Procedure

22 Jan 2024 | T. Tony Cai and Hongming Pu
This paper studies transfer learning for nonparametric regression under the posterior drift model, in which the difference between the mean functions in the target and source domains can be approximated by a polynomial. The authors establish the minimax optimal rate of convergence for this problem and propose a novel confidence thresholding estimator that attains the minimax risk up to a logarithmic factor. They also develop a data-driven algorithm that adaptively achieves the minimax risk over a wide range of parameter spaces without requiring knowledge of the smoothness parameters. Simulation studies and a real-data example demonstrate the effectiveness of the proposed methods. The analysis reveals two notable phenomena: auto-smoothing, where the minimax rate is independent of the smoother function, and super-acceleration, where the minimax risk for transfer learning is smaller than the risks of estimating the target and source functions separately. The authors further extend their analysis to multiple source domains, showing that the methods generalize to multiple source distributions.
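The snippet below is a minimal sketch of the confidence-thresholding idea summarized above, not the authors' exact procedure: fit the regression function on the (typically larger) source sample, smooth the target-sample residuals to estimate the drift between the two mean functions, and keep the correction only where it exceeds a confidence threshold. The Gaussian-kernel smoother, the bandwidths h_source and h_offset, and the threshold tau are illustrative assumptions.

import numpy as np

def kernel_smooth(x_train, y_train, x_eval, bandwidth):
    """Nadaraya-Watson estimate with a Gaussian kernel (1-D covariates assumed)."""
    d = x_eval[:, None] - x_train[None, :]
    w = np.exp(-0.5 * (d / bandwidth) ** 2)
    w /= w.sum(axis=1, keepdims=True)
    return w @ y_train

def transfer_estimate(x_src, y_src, x_tgt, y_tgt, x_eval,
                      h_source=0.1, h_offset=0.3, tau=0.2):
    """Illustrative transfer estimator under posterior drift: source-based fit
    plus a thresholded offset correction (tuning values are hypothetical)."""
    f_src_eval = kernel_smooth(x_src, y_src, x_eval, h_source)  # source fit at evaluation points
    f_src_tgt = kernel_smooth(x_src, y_src, x_tgt, h_source)    # source fit at target design points
    residuals = y_tgt - f_src_tgt                                # target residuals carry the drift
    offset = kernel_smooth(x_tgt, residuals, x_eval, h_offset)   # smooth residuals to estimate the offset
    offset = np.where(np.abs(offset) > tau, offset, 0.0)         # keep only confident corrections
    return f_src_eval + offset

# Toy usage: the target function differs from the source function by a smooth shift.
rng = np.random.default_rng(0)
x_src = rng.uniform(0, 1, 2000)
y_src = np.sin(2 * np.pi * x_src) + rng.normal(scale=0.3, size=x_src.size)
x_tgt = rng.uniform(0, 1, 200)
y_tgt = np.sin(2 * np.pi * x_tgt) + 0.5 * x_tgt + rng.normal(scale=0.3, size=x_tgt.size)
x_eval = np.linspace(0, 1, 101)
f_hat = transfer_estimate(x_src, y_src, x_tgt, y_tgt, x_eval)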