Regularization With Stochastic Transformations and Perturbations for Deep Semi-Supervised Learning

14 Jun 2016 | Mehdi Sajjadi, Mehran Javanmardi, Tolga Tasdizen
This paper addresses the challenge of semi-supervised learning with convolutional neural networks (ConvNets), which often require large labeled datasets for effective training. The authors propose an unsupervised loss function that leverages the stochastic nature of techniques like randomized data augmentation, dropout, and random max-pooling to minimize the differences in predictions when a sample passes through the network multiple times. This approach enhances the stability and generalization of the model, especially when labeled data is limited. The proposed loss function can be combined with any supervised loss function and is evaluated on several benchmark datasets, including MNIST, CIFAR10, CIFAR100, SVHN, NORB, and ImageNet. The results demonstrate significant improvements in accuracy, particularly when a small number of labeled samples are available. The combination of the transformation/stability loss function with the mutual-exclusivity loss function further enhances performance. The paper concludes that the proposed method effectively improves the accuracy of ConvNets, even with limited labeled data.
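The two unsupervised losses mentioned above can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the function names, array shapes, and per-sample formulation are assumptions. The transformation/stability term penalizes squared differences between the softmax outputs of n stochastic passes of the same sample, and the mutual-exclusivity term pushes each prediction vector toward a one-hot assignment.

```python
import numpy as np

def transform_stability_loss(preds):
    """Sum of squared differences between all pairs of the n stochastic
    predictions for one sample. `preds` has shape (n, num_classes);
    the shape and name are illustrative assumptions."""
    n = preds.shape[0]
    loss = 0.0
    for j in range(n):
        for k in range(j + 1, n):
            loss += np.sum((preds[j] - preds[k]) ** 2)
    return loss

def mutual_exclusivity_loss(p):
    """Mutual-exclusivity term for one softmax vector p of length C:
    -sum_k p_k * prod_{l != k} (1 - p_l), which is minimized when
    exactly one class probability is near 1."""
    c = len(p)
    total = 0.0
    for k in range(c):
        prod = 1.0
        for l in range(c):
            if l != k:
                prod *= 1.0 - p[l]
        total += p[k] * prod
    return -total
```

Identical predictions across passes give a transformation/stability loss of zero, while a confident one-hot prediction gives a lower (more negative) mutual-exclusivity loss than a uniform one, matching the intuition that both terms reward stable, decisive outputs on unlabeled data.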