01 May 2024 | Fanfan Li, Dingwei Li, Chuanqing Wang, Guole Liu, Rui Wang, Huihui Ren, Yingjie Tang, Yan Wang, Yitong Chen, Kun Liang, Qi Huang, Mohamad Sawan, Min Qiu, Hong Wang & Bowen Zhu
This study presents an artificial visual spiking neuron that enables multiplexed rate and time-to-first-spike (TTFS) coding of external visual information. The neuron integrates an In₂O₃ synaptic phototransistor and an NbOₓ Mott memristor, mimicking biological photoreceptors and retinal ganglion neurons. It can encode visual information using both rate coding (spiking frequency) and TTFS coding (spike latency), achieving high spiking frequencies (up to 1.85 MHz) and short first-spike latencies (down to 1.04 μs). This dual-coding scheme enhances the neuron's ability to process complex visual information efficiently and accurately, with low energy consumption (below 1.06 nJ per spike) and high endurance (over 10¹⁰ cycles). The neuron's performance is validated through a spiking neural network (SNN) that accurately predicts steering angles and speeds for autonomous vehicles in complex environments. The SNN with the proposed rate-temporal fusion (RTF) encoding scheme outperforms rate or TTFS coding alone, reducing the loss below 0.5. The study highlights the potential of biologically plausible neuromorphic hardware for efficient visual processing and real-time decision-making in autonomous systems.