Event-based Vision: A Survey


2020 | Guillermo Gallego, Tobi Delbrück, Garrick Orchard, Chiara Bartolozzi, Brian Taba, Andrea Censi, Stefan Leutenegger, Andrew J. Davison, Jörg Conradt, Kostas Daniilidis, Davide Scaramuzza
Event cameras are bio-inspired sensors that differ from conventional frame cameras: instead of capturing images at a fixed rate, they asynchronously measure per-pixel brightness changes and output a stream of events that encode the time, location, and sign of each brightness change. Each pixel operates independently, so the sensor is data-driven: its output depends on the motion or brightness changes in the scene. Event cameras offer high temporal resolution (on the order of microseconds), very high dynamic range (140 dB), low latency, low power consumption, and high pixel bandwidth. These properties give them large potential for robotics and wearable applications in scenarios that are challenging for standard cameras, such as high-speed motion, low latency requirements, and high-dynamic-range scenes, and make them well suited to real-time interaction systems. However, because event cameras work so differently from standard cameras, novel methods are required to process their unconventional output.
This survey provides a comprehensive overview of the emerging field of event-based vision. It outlines the principles of operation of event cameras, their advantages, and the challenges posed by the novel sensing paradigm; describes the available sensors; and reviews the tasks they have been applied to, from low-level vision (feature detection, optic flow estimation) to high-level vision (reconstruction, segmentation, recognition). Event cameras have proven useful for object tracking, surveillance, SLAM, depth estimation, structured-light 3D scanning, optical flow estimation, and HDR image reconstruction. The survey also covers techniques for processing events, including learning-based methods and specialized neuromorphic processors such as spiking neural networks, and reviews software, datasets, and simulators for event cameras. Finally, it highlights open challenges and opportunities for future research; as event cameras become widely available, further applications such as image deblurring and star tracking are expected to emerge.
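The per-pixel sensing principle described above — each pixel independently firing an event whenever its log-brightness changes by more than a contrast threshold — can be illustrated with a minimal simulation. This is a hedged sketch of the standard event-generation model, not code from the survey; the function name `generate_events` and the threshold parameter `C` are illustrative choices.

```python
import numpy as np

def generate_events(log_frames, timestamps, C=0.2):
    """Simulate event generation from a sequence of log-brightness frames.

    An event (t, x, y, polarity) is emitted for a pixel when its
    log-brightness has changed by at least the contrast threshold C
    since that pixel's last event. Polarity is +1 (brighter) or -1 (darker).
    """
    ref = log_frames[0].astype(float).copy()  # per-pixel reference level
    events = []
    for k in range(1, len(log_frames)):
        diff = log_frames[k] - ref
        ys, xs = np.nonzero(np.abs(diff) >= C)  # pixels crossing threshold
        for x, y in zip(xs, ys):
            p = 1 if diff[y, x] > 0 else -1
            events.append((timestamps[k], int(x), int(y), p))
            ref[y, x] += p * C  # reference moves by one threshold step
    return events

# Example: one pixel brightens between two frames; it alone fires an event.
f0 = np.zeros((2, 2))
f1 = np.zeros((2, 2))
f1[0, 0] = 0.5
evts = generate_events([f0, f1], [0.0, 0.001], C=0.2)
print(evts)  # [(0.001, 0, 0, 1)]
```

Note the data-driven character the survey emphasizes: static pixels produce no output at all, and only the moving or changing parts of the scene generate events.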