Training Deep Spiking Neural Networks Using Backpropagation

08 November 2016 | Jun Haeng Lee, Tobi Delbruck, Michael Pfeiffer
This paper introduces a novel method for training deep spiking neural networks (SNNs) with backpropagation, operating directly on spike signals and membrane potentials. The key idea is to treat membrane potentials as differentiable signals and to regard the discontinuities at spike times as noise, which allows error backpropagation through an SNN much as in a conventional deep network. Because training works directly on spikes, the approach captures spike statistics more accurately than previous methods.

The framework supports both fully connected and convolutional SNNs and was evaluated on the MNIST and N-MNIST datasets, achieving a threefold reduction in error rate compared to the best previous SNN and higher accuracy than conventional CNNs. On MNIST, it matches the accuracy of conventional networks while requiring significantly fewer computational operations. On N-MNIST, a network with 800 hidden units reaches 98.66% accuracy, surpassing previous results.

To stabilize training, the approach regularizes both weights and firing thresholds, and it introduces a novel error normalization technique. By training directly on event streams, the method can exploit the spatio-temporal structure of spike data; the results demonstrate significant gains in both accuracy and efficiency over traditional approaches to event-based data.
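The paper's code is not reproduced here; the following is a minimal sketch, assuming a PyTorch-style autograd and a leaky integrate-and-fire (LIF) neuron, of how the core idea can be realized: the spike is a hard threshold in the forward pass, while the backward pass differentiates the membrane potential directly, treating the discontinuity at spike times as noise. The class and function names, the decay constant, and the soft-reset rule are illustrative assumptions, not details taken from the paper.

```python
import torch

class SpikeStraightThrough(torch.autograd.Function):
    """Heaviside spike in the forward pass; in the backward pass the
    gradient flows through the membrane potential unchanged, ignoring
    the discontinuity at spike times (treated as noise)."""

    @staticmethod
    def forward(ctx, v_mem, threshold):
        return (v_mem >= threshold).float()

    @staticmethod
    def backward(ctx, grad_output):
        # Identity gradient w.r.t. v_mem; no gradient for the threshold.
        return grad_output, None

def lif_step(v_mem, input_current, threshold=1.0, decay=0.9):
    """One Euler step of a LIF neuron. The decay and threshold values,
    and the soft reset by subtraction, are illustrative assumptions."""
    v_mem = decay * v_mem + input_current
    spikes = SpikeStraightThrough.apply(v_mem, threshold)
    v_mem = v_mem - spikes * threshold  # soft reset after a spike
    return v_mem, spikes

# Toy usage over T time steps: gradients flow back through the spike
# nonlinearity to the inputs despite the hard threshold.
T, batch, n = 5, 2, 4
v = torch.zeros(batch, n)
x = torch.randn(T, batch, n, requires_grad=True)
total_spikes = torch.zeros(())
for t in range(T):
    v, s = lif_step(v, x[t])
    total_spikes = total_spikes + s.sum()
total_spikes.backward()  # x.grad is populated via the membrane potential
```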
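The summary also credits weight and threshold regularization and an error normalization step with stabilizing training, but does not spell out their formulations. The sketch below is therefore only a plausible reading, not the paper's method: an L2 weight penalty, a threshold that drifts toward a target firing rate, and per-layer rescaling of the backpropagated error to unit norm. All names, constants, and update rules are assumptions.

```python
import torch

def stabilizers(weights, thresholds, firing_rates, delta,
                target_rate=0.05, weight_decay=1e-4,
                thresh_lr=1e-3, eps=1e-8):
    """Hypothetical stabilizers in the spirit of the paper's
    weight/threshold regularization and error normalization;
    the exact formulations in the paper may differ."""
    # L2 weight decay keeps synaptic weights, and hence input
    # currents and firing rates, from growing without bound.
    weight_penalty = weight_decay * (weights ** 2).sum()
    # Nudge each neuron's threshold toward a target firing rate:
    # neurons that fire too often get a higher threshold.
    thresholds = thresholds + thresh_lr * (firing_rates - target_rate)
    # Rescale the layer's backpropagated error to unit L2 norm so
    # error magnitudes stay comparable across layers.
    delta = delta / (delta.norm() + eps)
    return weight_penalty, thresholds, delta
```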