Binarized Neural Networks: Training Neural Networks with Weights and Activations Constrained to +1 or -1

17 Mar 2016 | Matthieu Courbariaux, Itay Hubara, Daniel Soudry, Ran El-Yaniv, Yoshua Bengio
This paper introduces Binarized Neural Networks (BNNs): neural networks whose weights and activations are constrained to +1 or -1 at both train time and run time. The authors detail the binarization functions, how gradients are computed, and how backpropagation proceeds through the discretization. They run experiments on the MNIST, CIFAR-10, and SVHN datasets using the Torch7 and Theano frameworks, achieving near state-of-the-art results. Because most arithmetic reduces to bit-wise operations, BNNs substantially cut memory consumption and arithmetic cost, which improves power efficiency. The authors also demonstrate a 7x speedup on GPU for MNIST classification using a binary matrix multiplication kernel. The paper closes with related work and future directions, including extending the speed-ups to training and applying BNNs to other models and datasets.
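The core mechanism is compact enough to sketch. Below is a minimal NumPy illustration (not the authors' code; all variable names, shapes, and the learning rate are hypothetical) of the two ingredients the summary mentions: deterministic binarization via sign(), and backpropagation through that discretization using a straight-through-style estimator, in which gradients pass through where the input magnitude is at most 1 and are cancelled elsewhere, while real-valued "latent" weights accumulate the updates and are clipped to [-1, 1].

```python
import numpy as np

rng = np.random.default_rng(0)

def binarize(x):
    # Deterministic binarization: sign(x), with 0 mapped to +1.
    return np.where(x >= 0.0, 1.0, -1.0)

def ste_grad(x, grad_out):
    # The derivative of sign() is zero almost everywhere, so the
    # gradient is instead passed through where |x| <= 1 (a hard-tanh
    # style straight-through estimator) and cancelled elsewhere.
    return grad_out * (np.abs(x) <= 1.0)

# Hypothetical single binarized layer: real-valued latent weights W
# are binarized for the forward pass; gradients update W itself.
W = rng.uniform(-1.0, 1.0, size=(4, 3))   # latent real-valued weights
x = binarize(rng.standard_normal(3))      # binary input activations

Wb = binarize(W)
pre_act = Wb @ x                # dot products over {-1, +1} values
out = binarize(pre_act)         # binary output activations

# Backward pass for some upstream gradient g (shape chosen to match).
g = rng.standard_normal(4)
g_pre = ste_grad(pre_act, g)    # through the output binarization
g_W = np.outer(g_pre, x)        # gradient w.r.t. the binary weights...
g_W = ste_grad(W, g_W)          # ...routed to the latent weights

lr = 0.01                       # hypothetical learning rate
W = np.clip(W - lr * g_W, -1.0, 1.0)  # update, then clip latents
```

Because every operand in the forward pass is -1 or +1, the dot products can in principle be implemented with XNOR and popcount operations, which is the source of the memory and speed gains described above.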