BinaryConnect: Training Deep Neural Networks with binary weights during propagations


18 Apr 2016 | Matthieu Courbariaux, Yoshua Bengio, Jean-Pierre David
The paper "BinaryConnect: Training Deep Neural Networks with Binary Weights during Propagations" by Matthieu Courbariaux introduces a method called BinaryConnect, which trains deep neural networks (DNNs) using binary weights during both forward and backward propagations. This approach aims to reduce computational complexity and power consumption, making it suitable for low-power devices and specialized hardware for deep learning (DL). BinaryConnect replaces many multiply-accumulate operations with simple additions, significantly reducing the number of operations required. The method retains precision in the stored weights for gradient accumulation, similar to dropout, which acts as a regularizer. The authors demonstrate that BinaryConnect achieves near state-of-the-art results on permutation-invariant datasets such as MNIST, CIFAR-10, and SVHN. The paper also discusses the benefits of stochastic binarization and the importance of clipping weights to prevent them from exceeding the binary values. The code for BinaryConnect is made available, and the authors explore its effectiveness in various benchmarks, showing that it can be a powerful regularizer and potentially enable significant speed-ups in training and inference.The paper "BinaryConnect: Training Deep Neural Networks with Binary Weights during Propagations" by Matthieu Courbariaux introduces a method called BinaryConnect, which trains deep neural networks (DNNs) using binary weights during both forward and backward propagations. This approach aims to reduce computational complexity and power consumption, making it suitable for low-power devices and specialized hardware for deep learning (DL). BinaryConnect replaces many multiply-accumulate operations with simple additions, significantly reducing the number of operations required. The method retains precision in the stored weights for gradient accumulation, similar to dropout, which acts as a regularizer. The authors demonstrate that BinaryConnect achieves near state-of-the-art results on permutation-invariant datasets such as MNIST, CIFAR-10, and SVHN. The paper also discusses the benefits of stochastic binarization and the importance of clipping weights to prevent them from exceeding the binary values. The code for BinaryConnect is made available, and the authors explore its effectiveness in various benchmarks, showing that it can be a powerful regularizer and potentially enable significant speed-ups in training and inference.