Symmetric Cross Entropy for Robust Learning with Noisy Labels


16 Aug 2019 | Yisen Wang, Xingjun Ma, Zaiyi Chen, Yuan Luo, Jinfeng Yi, James Bailey
This paper introduces Symmetric Cross Entropy Learning (SL), a novel approach to train deep neural networks (DNNs) in the presence of noisy labels. The authors highlight that traditional Cross Entropy (CE) loss suffers from two main issues: overfitting to noisy labels on "easy" classes and underlearning on "hard" classes. To address these problems, SL combines CE with Reverse Cross Entropy (RCE), a noise-tolerant loss that promotes robust learning. Theoretical analysis shows that RCE is robust to symmetric and uniform label noise, and empirical results demonstrate that SL outperforms state-of-the-art methods on various benchmark and real-world datasets. SL is also easy to incorporate into existing models to enhance their performance. The paper provides a comprehensive analysis of the learning dynamics of DNNs with noisy labels, showing that SL effectively balances sufficient learning and robustness to label noise. The proposed approach is simple to implement and has shown promising results in improving the performance of DNNs under noisy label conditions.
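To make the combined loss concrete, here is a minimal NumPy sketch of symmetric cross entropy as described above: a weighted sum of standard CE and reverse CE, where the `log 0` terms arising from one-hot labels are replaced by a finite constant (implemented here by clipping the label distribution). The weights `alpha` and `beta` and the clipping value are illustrative hyperparameters, not values prescribed by this summary.

```python
import numpy as np

def symmetric_cross_entropy(probs, labels, alpha=0.1, beta=1.0,
                            eps=1e-7, label_clip=1e-4):
    """Symmetric cross entropy: alpha * CE + beta * RCE.

    probs:  (N, K) predicted class probabilities (softmax outputs).
    labels: (N,) integer class labels.
    alpha, beta, label_clip are illustrative hyperparameters.
    """
    n, k = probs.shape
    one_hot = np.eye(k)[labels]
    # Standard cross entropy: -sum_k q(k|x) log p(k|x)
    ce = -np.sum(one_hot * np.log(np.clip(probs, eps, 1.0)), axis=1)
    # Reverse cross entropy: -sum_k p(k|x) log q(k|x);
    # clipping the one-hot labels bounds log(0) at log(label_clip)
    rce = -np.sum(probs * np.log(np.clip(one_hot, label_clip, 1.0)), axis=1)
    return float(np.mean(alpha * ce + beta * rce))
```

A confident correct prediction yields a small loss, while a confident wrong prediction is penalized by both terms, which is the behavior the paper attributes to combining CE's fast convergence with RCE's noise tolerance.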