A Comprehensive Survey on Transfer Learning

23 Jun 2020 | Fuzhen Zhuang, Zhiyuan Qi, Keyu Duan, Dongbo Xi, Yongchun Zhu, Hengshu Zhu, Senior Member, IEEE, Hui Xiong, Fellow, IEEE, and Qing He
This survey provides a comprehensive overview of transfer learning, a machine learning technique that improves the performance of target learners on target domains by transferring knowledge from related source domains. The paper reviews over forty representative transfer learning approaches, focusing on homogeneous transfer learning, and discusses their mechanisms and strategies. It also introduces applications of transfer learning and conducts experiments on three datasets (Amazon Reviews, Reuters-21578, and Office-31) to demonstrate the performance of various transfer learning models. The survey highlights the importance of selecting appropriate transfer learning models for different applications and discusses the challenges and opportunities in transfer learning, including the distinction between homogeneous and heterogeneous transfer learning. The paper also addresses the concept of negative transfer, where knowledge from a source domain may negatively affect performance on a target domain. The survey aims to provide readers with a comprehensive understanding of transfer learning from the perspectives of data and model, and to help them select appropriate methods for their specific tasks. The paper also discusses related areas such as semi-supervised learning, multi-view learning, and multi-task learning, and their connections to transfer learning. The survey concludes with a summary of its main contributions, including the introduction of over forty transfer learning approaches and the comparison of their performance through experiments.
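To make the core idea concrete, below is a minimal sketch of one common transfer strategy: pre-train a model on a labeled source domain, then fine-tune it on a small number of labeled target examples. It is an illustration only; the synthetic Gaussian domains, the shift magnitude, and the use of scikit-learn's SGDClassifier are assumptions of this sketch, not models or experiments taken from the survey.

```python
# Minimal sketch of parameter-based transfer: pre-train on a source domain,
# then fine-tune on a few target labels. The synthetic data and the choice
# of SGDClassifier are illustrative assumptions, not the survey's methods.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)

def make_domain(n, shift, dim=20):
    """Two Gaussian classes; `shift` moves both means to simulate domain shift."""
    X0 = rng.normal(loc=-1.0 + shift, scale=1.0, size=(n // 2, dim))
    X1 = rng.normal(loc=+1.0 + shift, scale=1.0, size=(n // 2, dim))
    X = np.vstack([X0, X1])
    y = np.array([0] * (n // 2) + [1] * (n // 2))
    return X, y

# Large labeled source domain; small labeled target domain (related but shifted).
Xs, ys = make_domain(2000, shift=0.0)
Xt, yt = make_domain(60, shift=0.5)
Xt_test, yt_test = make_domain(2000, shift=0.5)
classes = np.array([0, 1])

# Target-only baseline: learn from scratch using only the few target labels.
baseline = SGDClassifier(random_state=0)
baseline.partial_fit(Xt, yt, classes=classes)

# Transfer: pre-train on the source domain, then continue training on the target.
transfer = SGDClassifier(random_state=0)
transfer.partial_fit(Xs, ys, classes=classes)  # source pre-training
transfer.partial_fit(Xt, yt)                   # target fine-tuning

print("target-only accuracy:", baseline.score(Xt_test, yt_test))
print("transfer accuracy:   ", transfer.score(Xt_test, yt_test))
```

When the source and target domains are sufficiently related, the pre-trained-then-fine-tuned model typically outperforms the target-only baseline; when they are not, performance can degrade, which is the negative transfer phenomenon discussed in the survey.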