Symmetric Cross Entropy for Robust Learning with Noisy Labels


16 Aug 2019 | Yisen Wang, Xingjun Ma, Zaiyi Chen, Yuan Luo, Jinfeng Yi, James Bailey
This paper addresses the challenge of training deep neural networks (DNNs) with noisy labels, a common issue in real-world datasets. The authors observe that the widely used Cross Entropy (CE) loss behaves unevenly under label noise: it overfits to noisy labels on some classes ("easy" classes) while under-learning others ("hard" classes). To tackle both problems at once, they propose Symmetric Cross Entropy Learning (SL), which combines CE with a noise-tolerant Reverse Cross Entropy (RCE) term. Theoretical analysis and empirical results on benchmark datasets and the real-world Clothing1M dataset show that SL outperforms state-of-the-art methods in both robustness and accuracy. The authors also show that SL can be easily integrated into existing models to enhance their performance.
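
The paper defines the SL loss as a weighted combination of the two terms, ℓSL = α·ℓCE + β·ℓRCE, where RCE swaps the roles of the label distribution and the model prediction and replaces the undefined log 0 with a negative constant A. Below is a minimal PyTorch sketch of this idea; the class name, the default values of alpha, beta, and A, and other implementation details are illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn.functional as F


class SCELoss(torch.nn.Module):
    """Sketch of Symmetric Cross Entropy: loss = alpha * CE + beta * RCE."""

    def __init__(self, alpha=0.1, beta=1.0, num_classes=10, A=-4.0):
        super().__init__()
        self.alpha = alpha            # weight on the standard CE term
        self.beta = beta              # weight on the reverse CE term
        self.num_classes = num_classes
        self.A = A                    # finite stand-in for log(0) in RCE

    def forward(self, logits, targets):
        # Standard cross entropy H(q, p): q = one-hot label, p = prediction.
        ce = F.cross_entropy(logits, targets)

        # Reverse cross entropy H(p, q): log q(k|x) is 0 for the labeled
        # class (log 1) and A for every other class (in place of log 0).
        pred = F.softmax(logits, dim=1)
        one_hot = F.one_hot(targets, self.num_classes).float()
        log_q = (1.0 - one_hot) * self.A
        rce = -(pred * log_q).sum(dim=1).mean()

        return self.alpha * ce + self.beta * rce


# Hypothetical usage: model(images) returns class logits for noisy_labels.
# criterion = SCELoss(alpha=0.1, beta=1.0, num_classes=10)
# loss = criterion(model(images), noisy_labels)
```

With a clip value such as A = -4, the RCE term reduces to a scaled mean absolute error, which is the source of SL's noise tolerance, while the CE term retains the fast convergence needed to fit the hard classes.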