CMax-SLAM: Event-based Rotational-Motion Bundle Adjustment and SLAM System using Contrast Maximization

2024 | Shuang Guo and Guillermo Gallego
This paper presents CMax-SLAM, an event-based rotational-motion estimation and SLAM system that uses contrast maximization (CMax) for bundle adjustment. Event cameras capture pixel-wise intensity changes and output asynchronous event streams, which makes them well suited to high-speed and high-dynamic-range scenarios in robotics and computer vision. However, existing event-based rotation estimation methods have not been evaluated under unified criteria and do not include a global refinement step. The authors therefore conduct a systematic study of event-based rotational motion estimation, comparing previous works both theoretically and experimentally.

They propose the first event-based rotation-only bundle adjustment (BA) approach. Built on the CMax framework, it avoids the need to convert events into frames. This BA is used to construct CMax-SLAM, the first event-based rotation-only SLAM system comprising a front-end and a back-end; the BA can run both offline (trajectory smoothing) and online (as the CMax-SLAM back-end). The authors demonstrate the performance and versatility of the method on synthetic and real-world datasets, including indoor, outdoor, and space scenarios. They discuss the pitfalls of real-world evaluation, propose a proxy for the reprojection error as the figure of merit for event-based rotation BA methods, and release their source code and novel data sequences to benefit the community.
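To make the CMax principle behind both the front-end and the BA concrete, here is a minimal, hedged sketch: warp the events of a short time window with a candidate angular velocity, accumulate them into an image of warped events (IWE), and score the IWE's contrast (its variance). Event edges align when the candidate motion is correct, so the sharpest IWE identifies the best estimate. The function names, the nearest-pixel voting, and the constant-velocity assumption below are illustrative choices for this sketch, not the authors' implementation.

```python
# Minimal sketch of contrast maximization (CMax) for rotational motion:
# warp events with a candidate angular velocity, build an image of warped
# events (IWE), and score its contrast. Names/parameters are illustrative.
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.transform import Rotation

def warp_events(xy, t, omega, K, K_inv, t_ref=0.0):
    """Rotate event bearings back to time t_ref under constant angular velocity."""
    ones = np.ones((xy.shape[0], 1))
    bearings = (K_inv @ np.hstack([xy, ones]).T).T            # pixels -> camera rays
    rotvecs = -np.outer(t - t_ref, omega)                     # per-event inverse rotation
    rotated = Rotation.from_rotvec(rotvecs).apply(bearings)   # undo the motion
    proj = (K @ rotated.T).T
    return proj[:, :2] / proj[:, 2:3]                         # back to pixel coordinates

def neg_contrast(omega, xy, t, K, K_inv, img_size):
    """Negative IWE variance (to be minimized): sharper IWE = better motion fit."""
    xw = warp_events(xy, t, omega, K, K_inv)
    iwe = np.zeros(img_size)
    ix = np.round(xw[:, 0]).astype(int)
    iy = np.round(xw[:, 1]).astype(int)
    ok = (ix >= 0) & (ix < img_size[1]) & (iy >= 0) & (iy < img_size[0])
    np.add.at(iwe, (iy[ok], ix[ok]), 1.0)                     # one vote per warped event
    return -np.var(iwe)

# Usage: estimate the angular velocity of a short event slice.
# xy (N,2 pixels) and t (N, seconds) would come from a real event stream.
# estimate = minimize(neg_contrast, x0=np.zeros(3),
#                     args=(xy, t, K, np.linalg.inv(K), (180, 240)),
#                     method="Nelder-Mead")
```

In the paper's terminology, the front-end estimates local rotational motion in this spirit, while the back-end refines a full continuous-time trajectory (see the sketch after the contribution list).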
The contributions of this work are:
1. A theoretical comparison and experimental benchmark of several event-based rotational motion estimation methods under unified criteria.
2. The first event-based rotation-only BA method, which refines the continuous-time trajectory of an event camera while reconstructing a sharp panoramic map (sketched below).
3. The first event-based rotation-only SLAM system, CMax-SLAM, comprising both a front-end and a back-end.
4. A demonstration of the method on a variety of scenarios, showing the versatility of the approach.
5. A discussion of the potential pitfalls of evaluating rotation-only methods on non-strictly rotational data, together with a sensible figure of merit for event-based rotational BA in real-world scenarios.
6. The release of the source code and of novel data sequences with high (VGA) spatial resolution.

The paper also presents a comprehensive comparative evaluation of the rotation-only motion estimators, including experiments on synthetic and real-world data and a discussion of the issues that arise with the latter. It assesses the runtime of the methods, demonstrates CMax-SLAM in complex scenes, shows its super-resolution capabilities, and concludes with a sensitivity analysis.
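Contribution (2), the rotation-only BA, can be sketched in the same spirit. Assuming, for illustration only, a trajectory obtained by spherical linear interpolation (SLERP) between control rotations and an equirectangular panoramic map scored by its variance (the paper's actual trajectory parameterization, map model, and objective may differ), the BA reduces to optimizing the control rotations so that the map built from all warped events is as sharp as possible:

```python
# Hedged sketch of rotation-only BA: warp all events onto a panoramic
# (equirectangular) map using a continuous-time rotation trajectory
# interpolated from control poses, and score the map's contrast.
# The trajectory model, projection, and objective are assumptions made
# for brevity, not the authors' exact formulation.
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def neg_panorama_contrast(ctrl_rotvecs, knot_times, xy, t, K_inv, pano_size=(256, 512)):
    H, W = pano_size
    traj = Slerp(knot_times, Rotation.from_rotvec(ctrl_rotvecs.reshape(-1, 3)))
    ones = np.ones((xy.shape[0], 1))
    rays = (K_inv @ np.hstack([xy, ones]).T).T                 # pixels -> camera rays
    world = traj(np.clip(t, knot_times[0], knot_times[-1])).apply(rays)
    world /= np.linalg.norm(world, axis=1, keepdims=True)
    az = np.arctan2(world[:, 0], world[:, 2])                  # azimuth in [-pi, pi]
    el = np.arcsin(world[:, 1])                                # elevation in [-pi/2, pi/2]
    u = ((az / (2 * np.pi) + 0.5) * W).astype(int) % W
    v = np.clip(((el / np.pi + 0.5) * H).astype(int), 0, H - 1)
    pano = np.zeros((H, W))
    np.add.at(pano, (v, u), 1.0)                               # event count map
    return -np.var(pano)                                       # sharper map -> lower value

# Conceptually, a BA back-end would refine the control rotations supplied by a
# front-end by minimizing this objective (e.g., with scipy.optimize.minimize)
# and keep the resulting panorama as the map.
```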