28 May 2024 | Botao He*1,2,†, Ze Wang3,4, Yuan Zhou2,3, Jingxi Chen1, Chahat Deep Singh1, Haojia Li5, Yuman Gao2,3, Shaojie Shen5, Kaiwei Wang4, Yanjun Cao3, Chao Xu2,3, Yiannis Aloimonos1,6,7, Fei Gao*2,3, and Cornelia Fermüller*1,6,7
This paper proposes an artificial microsaccade-enhanced event camera (AMI-EV) for robotics, inspired by the human microsaccade mechanism that keeps the retinal image stable and informative. Event cameras respond only to brightness changes, so when relative motion is low they lose texture stability and stop producing persistent information. AMI-EV overcomes this limitation with a rotating wedge prism that continuously redirects incoming light, triggering events at all scene edges independently of external motion. This continuous light steering improves efficiency and yields a stable texture appearance and high informational output even in scenarios where standard and event cameras fail.

Because the prism rotation itself induces image motion, the system pairs the hardware with a compensation algorithm that removes this known, self-induced motion from the event stream, keeping the output compatible with existing event-based algorithms; a minimal sketch of this idea follows below.

The system was validated in experiments spanning low- and high-level vision tasks. In texture enhancement, feature detection, motion segmentation, and human detection, AMI-EV outperformed both standard cameras and conventional event cameras, and its integrated hardware-software design enabled robust performance in challenging environments, with potential applications across a range of robotics tasks.
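To make the compensation idea concrete, here is a minimal sketch. It assumes a thin-prism, small-angle model in which a wedge of angle α and refractive index n deflects the optical axis by roughly (n − 1)α, and rotation at a known rate sweeps that deflection around a circle, shifting the whole image by a predictable offset. All constants, function names, and the circular-shift model itself are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Illustrative constants (not from the paper): wedge angle, refractive
# index, rotation rate, and focal length expressed in pixels.
ALPHA = np.deg2rad(2.0)   # wedge angle of the prism
N_GLASS = 1.5             # refractive index of the prism material
OMEGA = 2 * np.pi * 50.0  # prism rotation rate, rad/s (50 Hz)
FOCAL_PX = 600.0          # focal length in pixels

# Thin-prism, small-angle approximation: the optical axis is deflected
# by delta = (n - 1) * alpha; rotating the prism sweeps this deflection
# around a circle of known radius in the image plane.
DELTA = (N_GLASS - 1.0) * ALPHA
RADIUS_PX = FOCAL_PX * np.tan(DELTA)  # radius of the induced pixel orbit

def prism_offset(t, phase=0.0):
    """Known image shift (dx, dy) in pixels induced by the prism at time t."""
    angle = OMEGA * t + phase
    return RADIUS_PX * np.cos(angle), RADIUS_PX * np.sin(angle)

def compensate_events(events, phase=0.0):
    """Subtract the prism-induced shift from raw event coordinates.

    `events` is an (N, 4) array of (t, x, y, polarity) rows; the output
    has the same layout but with coordinates in the motion-compensated
    frame, so downstream event algorithms see only scene motion.
    """
    t, x, y, p = events.T
    dx, dy = prism_offset(t, phase)
    return np.stack([t, x - dx, y - dy, p], axis=1)

# Example: three synthetic events from a static edge; after compensation
# they collapse back to (roughly) the same scene location.
raw = np.array([
    [0.000, 320.0 + RADIUS_PX, 240.0, 1],
    [0.005, 320.0 + prism_offset(0.005)[0], 240.0 + prism_offset(0.005)[1], 1],
    [0.010, 320.0 + prism_offset(0.010)[0], 240.0 + prism_offset(0.010)[1], 0],
])
print(compensate_events(raw)[:, 1:3])  # x, y ~ (320, 240) for every row
```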
To facilitate future research, the system is accompanied by a simulator and a translator for generating synthetic AMI-EV data; a minimal frame-to-event translation sketch appears after this paragraph. Overall, the design and experimental performance of AMI-EV demonstrate the effectiveness of artificial microsaccades in enhancing event-based vision for robotics applications.
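As a rough illustration of what such a translator might do, the following sketch converts a stack of standard grayscale frames into synthetic events using a generic log-intensity-difference scheme with a per-pixel contrast threshold. This is a common event-simulation recipe, not the authors' released tool; the threshold value, function names, and the one-event-per-frame simplification are assumptions.

```python
import numpy as np

C_THRESH = 0.2  # contrast threshold (log-intensity units); illustrative value
EPS = 1e-3      # avoids log(0) on fully dark pixels

def frames_to_events(frames, timestamps):
    """Translate a stack of grayscale frames into synthetic events.

    Each pixel keeps a reference log-intensity level and fires an event
    when the current log intensity has moved by more than C_THRESH since
    that pixel's last event (at most one event per pixel per frame, a
    simplification of real event pixels). Returns an (N, 4) array of
    (t, x, y, polarity) rows with polarity in {-1, +1}.
    """
    ref = np.log(frames[0].astype(np.float64) + EPS)
    events = []
    for frame, t in zip(frames[1:], timestamps[1:]):
        log_i = np.log(frame.astype(np.float64) + EPS)
        diff = log_i - ref
        for pol in (+1, -1):
            ys, xs = np.where(pol * diff >= C_THRESH)
            for x, y in zip(xs, ys):
                events.append((t, x, y, pol))
        # Update the reference only where an event fired, as a real
        # event pixel would after crossing its threshold.
        fired = np.abs(diff) >= C_THRESH
        ref[fired] = log_i[fired]
    return np.array(sorted(events), dtype=np.float64)

# Example: a bright square moving one pixel per frame triggers ON events
# at its leading edge and OFF events at its trailing edge.
frames = np.zeros((3, 64, 64), dtype=np.uint8)
for i in range(3):
    frames[i, 20:40, 10 + i:30 + i] = 255
print(frames_to_events(frames, timestamps=np.array([0.0, 0.01, 0.02])).shape)
```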