Microsaccade-inspired Event Camera for Robotics

28 May 2024 | Botao He*,1,2,†, Ze Wang3,4, Yuan Zhou2,3, Jingxi Chen1, Chahat Deep Singh1, Haojia Li5, Yuman Gao2,3, Shaojie Shen5, Kaiwei Wang4, Yanjun Cao3, Chao Xu2,3, Yiannis Aloimonos1,6,7, Fei Gao*2,3, and Cornelia Fermüller*1,6,7
The paper presents an event camera system designed to enhance visual perception in robotics by emulating the microsaccade mechanism of human vision. Microsaccades are small, involuntary eye movements that maintain texture stability and persistence during fixation, which is crucial for accurate perception.

The proposed system, the Artificial Microsaccade-enhanced Event camera (AMI-EV), uses a rotating wedge prism to continuously redirect incoming light and trigger events, ensuring that edges of all orientations are captured regardless of the camera's motion. This design addresses texture instability and information loss in standard event cameras, which fail to capture object edges parallel to the direction of camera motion, because such edges produce no brightness change at any pixel.

The AMI-EV system is validated through experiments demonstrating superior performance in texture stability, feature detection, and high-level vision tasks such as human detection and pose estimation. The hardware and software components are integrated, and a simulator and a translator are provided to facilitate future research and application development. The results show that AMI-EV improves the quality and reliability of event-based vision in robotics, making it a promising tool for a wide range of applications.
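To make the compensation idea concrete: because the prism's rotation is known, its contribution to each event's pixel position can be subtracted, leaving only scene- and camera-induced motion. The paper does not spell out its translator's model here, so the following is a minimal sketch under the assumption that the rotating prism induces a circular image translation of fixed pixel radius; the constants RADIUS_PX and OMEGA_RAD_S and the function compensate_prism_motion are hypothetical illustrations, not the authors' implementation.

```python
import numpy as np

# Hypothetical calibration constants: real values depend on the prism's
# wedge angle, rotation rate, and the optics of the actual prototype.
RADIUS_PX = 12.0                  # assumed radius of the circular image shift
OMEGA_RAD_S = 2 * np.pi * 30.0    # assumed prism rotation rate (30 Hz)

def compensate_prism_motion(t, x, y, radius_px=RADIUS_PX,
                            omega=OMEGA_RAD_S, phase0=0.0):
    """Subtract the prism-induced circular image translation from raw
    event coordinates, recovering a motion-stabilized event stream.

    t, x, y: arrays of event timestamps (s) and pixel coordinates.
    """
    theta = omega * np.asarray(t, dtype=float) + phase0
    x_stab = np.asarray(x, dtype=float) - radius_px * np.cos(theta)
    y_stab = np.asarray(y, dtype=float) - radius_px * np.sin(theta)
    return x_stab, y_stab

# Example: three events triggered at different prism phases.
t = np.array([0.000, 0.008, 0.016])   # seconds
x = np.array([112.0, 118.0, 109.0])   # raw pixel x
y = np.array([64.0, 70.0, 61.0])      # raw pixel y
print(compensate_prism_motion(t, x, y))
```

The key design point this sketch illustrates is that the prism adds a deterministic, invertible motion: it guarantees events are triggered for edges of every orientation, and a simple per-event coordinate correction removes it afterward.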