24 Nov 2015 | Antti Rasmus, Harri Valpola, Mikko Honkala, Mathias Berglund, Tapani Raiko
The paper introduces a semi-supervised learning method that combines supervised and unsupervised learning in deep neural networks. The proposed model, called the Ladder Network, is trained to minimize the sum of supervised and unsupervised cost functions simultaneously, without the need for layer-wise pre-training. The unsupervised part focuses on the relevant details found by supervised learning and can be added to existing feedforward neural networks. The key aspects of the approach are compatibility with supervised methods, scalability resulting from local learning, and computational efficiency. The Ladder Network is shown to achieve state-of-the-art performance in semi-supervised MNIST and CIFAR-10 classification, as well as permutation-invariant MNIST classification with all labels. Implementation details, including the encoder and decoder structures, are provided, along with experimental results demonstrating the effectiveness of the method.
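The training recipe described above, one corrupted encoder pass for the supervised cost, one clean pass to provide denoising targets, and a decoder whose per-layer reconstruction errors form the unsupervised cost, can be sketched compactly. The sketch below is illustrative rather than the authors' implementation: the layer sizes, noise level, and denoising-cost weights (`lambdas`) are placeholder values, the decoder uses plain linear maps instead of the paper's per-unit combinator function, and batch normalization and the top-layer denoising cost are omitted.

```python
# Minimal Ladder-Network-style objective: supervised cross-entropy on a
# noise-corrupted encoder pass plus weighted per-layer denoising costs
# against a clean pass. Sizes, noise_std, and lambdas are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MiniLadder(nn.Module):
    def __init__(self, sizes=(784, 500, 10), noise_std=0.3):
        super().__init__()
        self.noise_std = noise_std
        self.encoders = nn.ModuleList(
            nn.Linear(sizes[i], sizes[i + 1]) for i in range(len(sizes) - 1)
        )
        # One decoder step per encoder layer, mapping each layer back down;
        # a plain linear map stands in for the paper's combinator function.
        self.decoders = nn.ModuleList(
            nn.Linear(sizes[i + 1], sizes[i])
            for i in reversed(range(len(sizes) - 1))
        )

    def encode(self, x, corrupt):
        # Collect pre-activations z at every layer; inject Gaussian noise
        # on the corrupted pass only.
        zs = [x + torch.randn_like(x) * self.noise_std if corrupt else x]
        h = zs[0]
        for i, enc in enumerate(self.encoders):
            z = enc(h)
            if corrupt:
                z = z + torch.randn_like(z) * self.noise_std
            zs.append(z)
            h = F.relu(z) if i < len(self.encoders) - 1 else z  # top = logits
        return zs, h

    def forward(self, x):
        z_clean, _ = self.encode(x, corrupt=False)   # denoising targets
        z_corr, logits = self.encode(x, corrupt=True)  # supervised path
        # Decode top-down, producing a denoised estimate of each layer.
        z_hat, u = [], z_corr[-1]
        for dec in self.decoders:
            u = dec(u)
            z_hat.append(u)
        z_hat = z_hat[::-1]  # bottom-up order, aligned with z_clean[:-1]
        return logits, z_clean, z_hat

def ladder_loss(logits, z_clean, z_hat, y, lambdas=(1000.0, 10.0)):
    # Total cost = supervised cost on the corrupted path + weighted sum of
    # per-layer reconstruction errors. Clean activations are treated as
    # fixed targets here (a simplification of the full model).
    sup = F.cross_entropy(logits, y)
    unsup = sum(lam * F.mse_loss(zh, zc.detach())
                for lam, zh, zc in zip(lambdas, z_hat, z_clean))
    return sup + unsup

# Usage on dummy MNIST-shaped data (unlabeled examples would simply skip
# the cross-entropy term):
model = MiniLadder()
x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
logits, z_clean, z_hat = model(x)
ladder_loss(logits, z_clean, z_hat, y).backward()
```

The per-layer weights `lambdas` are hyperparameters in the paper, tuned per task; because the unsupervised cost is computed locally at each layer, the method scales in the way the summary describes, and the decoder can be bolted onto an existing feedforward classifier without restructuring it.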