10 Mar 2022 | Hwanjun Song, Minseok Kim, Dongmin Park, Yooju Shin, Jae-Gil Lee
This survey paper focuses on the challenge of learning from noisy labels in deep neural networks (DNNs). It highlights the significant impact of noisy labels on the generalization performance of DNNs and reviews 62 state-of-the-art robust training methods, categorizing them into five groups based on their methodological differences. The paper then provides a systematic comparison of these methods along six properties used to evaluate their effectiveness. Additionally, it discusses noise rate estimation and the evaluation methodology commonly used in this field, including public noisy datasets and evaluation metrics. Finally, the paper outlines several promising research directions for future studies, emphasizing the importance of robust training in modern deep learning applications. The accompanying resources are available at <https://github.com/songhwanjun/Awesome-Noisy-Labels>.
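To make the idea of robust training against noisy labels more concrete, here is a minimal, hypothetical sketch of one family of methods the survey covers: sample selection via the "small-loss trick," where only the samples with the smallest current loss (assumed more likely to be clean) contribute to the update. The function name, the toy data, and the fixed noise rate are illustrative assumptions, not drawn from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def small_loss_selection_step(model, optimizer, inputs, noisy_labels, noise_rate=0.2):
    """One training step that updates the model only on the fraction of
    samples with the smallest loss, assuming those are more likely clean."""
    logits = model(inputs)
    per_sample_loss = F.cross_entropy(logits, noisy_labels, reduction="none")

    # Keep the (1 - noise_rate) fraction of samples with the smallest loss.
    num_keep = max(1, int((1.0 - noise_rate) * inputs.size(0)))
    keep_idx = torch.argsort(per_sample_loss)[:num_keep]

    loss = per_sample_loss[keep_idx].mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Toy usage: a linear classifier on random features with random (hence noisy) labels.
model = nn.Linear(10, 3)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
x = torch.randn(64, 10)
y = torch.randint(0, 3, (64,))
print(small_loss_selection_step(model, optimizer, x, y, noise_rate=0.2))
```

In practice, methods in this family typically anneal the kept fraction over training and often estimate the noise rate rather than fixing it, which is why the survey's discussion of noise rate estimation matters.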