Error-Backpropagation in Temporally Encoded Networks of Spiking Neurons

2000 | Sander M. Bohte¹, Han A. La Poutré¹, Joost N. Kok¹,²
This paper presents SpikeProp, a supervised learning algorithm for spiking neural networks in which information is encoded in the timing of individual spikes. The algorithm is derived from classical error-backpropagation and handles the discontinuity introduced by the spiking threshold by linearizing the relationship between the membrane potential and the firing time around the moment of spiking. With this rule, networks of spiking neurons perform complex non-linear classification as effectively as rate-coded sigmoidal networks while using significantly fewer neurons.

The algorithm is demonstrated on the XOR problem and on real-world benchmarks, including the Iris, Wisconsin breast cancer, and Landsat datasets. The experiments support the theoretical prediction that, for reliable temporal computation, the rising segment of the post-synaptic potential must be longer than the relevant temporal structure of the input. The authors conclude that spiking neurons are better suited than sigmoidal units for learning and evaluating temporal patterns, that temporal coding can require fewer neurons than rate coding, and that these results lend plausibility to the hypothesis that the cortex actively uses spike-time coding for information processing.
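The core idea — treating the firing time as locally linear in the membrane potential around the threshold crossing, so that a gradient of the spike time with respect to the weights exists — can be sketched for a single spike-response neuron. The kernel shape, parameter values, and the toy one-neuron task below are illustrative assumptions, not the paper's exact experimental setup.

```python
import numpy as np

TAU, THETA = 7.0, 1.0  # PSP time constant and firing threshold (illustrative values)

def eps(s):
    """Alpha-shaped spike-response kernel: the PSP evoked s ms after an input spike."""
    return np.where(s > 0, (s / TAU) * np.exp(1.0 - s / TAU), 0.0)

def deps(s):
    """Time derivative of the kernel, used to linearize around the threshold crossing."""
    return np.where(s > 0, (1.0 / TAU) * np.exp(1.0 - s / TAU) * (1.0 - s / TAU), 0.0)

def spike_time(w, t_in, t_max=30.0, dt=0.01):
    """First time the potential u(t) = sum_i w_i * eps(t - t_i) crosses THETA."""
    ts = np.arange(0.0, t_max, dt)
    u = eps(ts[:, None] - t_in[None, :]) @ w
    idx = np.argmax(u >= THETA)
    return ts[idx] if u[idx] >= THETA else None

# Toy task: one neuron, two input spikes; learn weights so it fires at time t_d.
t_in = np.array([0.0, 2.0])   # input spike times
w = np.array([0.8, 0.8])      # initial synaptic weights
t_d, eta = 4.0, 0.05          # desired firing time and learning rate

errs = []
for _ in range(50):
    t_a = spike_time(w, t_in)
    if t_a is None:           # neuron fell silent; the gradient is undefined
        break
    errs.append(0.5 * (t_a - t_d) ** 2)
    # SpikeProp-style gradient: dt_a/dw_i = -eps(t_a - t_i) / u'(t_a),
    # valid while u(t) is rising through the threshold.
    du = deps(t_a - t_in) @ w
    w -= eta * (t_a - t_d) * (-eps(t_a - t_in) / du)
```

The linearization becomes unreliable when the potential only grazes the threshold (u'(t_a) ≈ 0), which makes small learning rates advisable in this scheme.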