Exploring simple triplet representation learning

2024 | Zeyu Ren, Quan Lan, Yudong Zhang, Shuihua Wang
This paper introduces SimTrip, a novel self-supervised learning method designed to efficiently extract meaningful representations from unlabelled data, particularly in the context of medical image analysis. The method leverages a triple-view architecture and a novel loss function, TriLoss, to learn from unlabelled images with small batch sizes and reduced computational power. SimTrip aims to address the challenges of model collapse, large computational requirements, and high batch sizes common in other self-supervised methods.

The evaluation of SimTrip on two medical image datasets, Acute Lymphoblastic Leukemia (ALL) and Lung Cancer 25000 (LC25000), demonstrates its superior performance compared to state-of-the-art methods, including fully supervised, semi-supervised, and unsupervised approaches. SimTrip achieves high precision, recall, F1-score, and accuracy across different labelled ratios and batch sizes, showcasing its robustness and efficiency. The method's effectiveness is further validated through an ablation study, which highlights the importance of image augmentation, the projection MLP, and the prediction MLP in SimTrip's performance. Despite some limitations, such as the need for more complex proxy tasks and the inability to handle transfer learning tasks with large datasets, SimTrip offers a promising approach for unsupervised representation learning, especially in scenarios with limited labelled data.
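The summary above does not reproduce the exact form of TriLoss. As a minimal illustrative sketch, assuming a symmetric negative-cosine objective over embeddings of three augmented views of the same image (the function and variable names here are hypothetical, not from the paper):

```python
import numpy as np

def cosine_similarity(a, b):
    """Row-wise cosine similarity between two batches of embeddings."""
    a = a / np.linalg.norm(a, axis=-1, keepdims=True)
    b = b / np.linalg.norm(b, axis=-1, keepdims=True)
    return np.sum(a * b, axis=-1)

def tri_loss(z1, z2, z3):
    """Illustrative triple-view loss: mean negative cosine similarity
    over the three view pairs. Minimising it pulls the three embeddings
    of each image together; it reaches -1 when all views agree."""
    pairwise = (cosine_similarity(z1, z2)
                + cosine_similarity(z2, z3)
                + cosine_similarity(z1, z3)) / 3.0
    return -np.mean(pairwise)

# Example: a batch of 4 images, 8-dim embeddings per view
z1 = np.random.randn(4, 8)
z2 = z1 + 0.1 * np.random.randn(4, 8)  # slightly perturbed view
z3 = z1 + 0.1 * np.random.randn(4, 8)
loss = tri_loss(z1, z2, z3)
```

A batch-size-independent pairwise objective like this is consistent with the paper's claim that SimTrip trains with small batches, since no cross-image negatives are required; the published TriLoss may differ in detail.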