Richard A. Newcombe, Shahram Izadi, Otmar Hilliges, David Molyneaux, David Kim, Andrew J. Davison, Pushmeet Kohli, Jamie Shotton, Steve Hodges, Andrew Fitzgibbon
The paper presents a system for real-time, dense surface mapping and tracking of complex indoor scenes using a handheld Kinect depth camera and commodity graphics hardware. The system fuses depth data from the Kinect sensor into a global implicit surface model in real-time, while simultaneously tracking the sensor's pose using a coarse-to-fine iterative closest point (ICP) algorithm. This approach ensures that tracking is always relative to the fully up-to-date fused dense model, providing advantages over frame-to-frame tracking. The system can operate in complete darkness and is designed to enable high-quality, real-time reconstruction and tracking, which is crucial for augmented reality (AR) applications. The use of general-purpose GPU (GPGPU) techniques allows the system to perform at the frame rate of the Kinect sensor, achieving constant-time performance.
The paper includes qualitative and quantitative results demonstrating the system's performance in various aspects of tracking and mapping.
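The fusion step described above — merging each depth frame into a global implicit surface model — is commonly realised as a truncated signed distance function (TSDF) volume updated with a per-voxel weighted running average. The sketch below illustrates that idea in NumPy; all function names, parameters, and defaults are illustrative assumptions for a single-frame CPU version, not the paper's GPU implementation:

```python
import numpy as np

def fuse_depth_into_tsdf(tsdf, weights, depth, K, T_cam_from_world,
                         voxel_origin, voxel_size, trunc=0.03):
    """Fuse one depth frame (metres) into a TSDF volume (illustrative sketch)."""
    # World coordinates of every voxel centre, flattened in C order
    X, Y, Z = tsdf.shape
    ii, jj, kk = np.meshgrid(np.arange(X), np.arange(Y), np.arange(Z),
                             indexing="ij")
    pts_w = voxel_origin + voxel_size * np.stack([ii, jj, kk], -1).reshape(-1, 3)

    # Transform into the camera frame and project with the pinhole model K
    pts_c = pts_w @ T_cam_from_world[:3, :3].T + T_cam_from_world[:3, 3]
    z = pts_c[:, 2]
    z_safe = np.where(z > 0, z, 1.0)             # avoid divide-by-zero
    uv = pts_c @ K.T
    u = np.round(uv[:, 0] / z_safe).astype(int)
    v = np.round(uv[:, 1] / z_safe).astype(int)

    H, W = depth.shape
    valid = (z > 0) & (u >= 0) & (u < W) & (v >= 0) & (v < H)
    d = np.zeros_like(z)
    d[valid] = depth[v[valid], u[valid]]
    valid &= d > 0                               # skip pixels with no depth

    # Truncated signed distance along the viewing ray, normalised to [-1, 1]
    sdf = d - z
    valid &= sdf > -trunc                        # ignore voxels far behind the surface
    f = np.clip(sdf, -trunc, trunc) / trunc

    # Per-voxel weighted running average (each observation has weight 1)
    t, w = tsdf.reshape(-1), weights.reshape(-1)
    t[valid] = (t[valid] * w[valid] + f[valid]) / (w[valid] + 1.0)
    w[valid] += 1.0
    return tsdf, weights
```

Averaging truncated distances rather than overwriting them is what lets noisy per-frame depth converge to a smooth global surface as more frames are fused.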
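The coarse-to-fine ICP tracker aligns each incoming depth frame against a surface prediction raycast from the fused model. The core of each iteration is a linearised point-to-plane solve for a small 6-DoF motion. A minimal sketch of that single step, assuming correspondences are already given (the pyramid, projective data association, and outlier rejection are omitted, and all names are illustrative):

```python
import numpy as np

def point_to_plane_icp_step(src_pts, dst_pts, dst_normals):
    """One linearised point-to-plane ICP step (illustrative sketch).

    Minimises sum_i ((R p_i + t - q_i) . n_i)^2 with the small-angle
    approximation R ~ I + [w]_x, giving a 6-parameter linear least squares.
    """
    # Each row of A is [p x n, n]; the rhs is (q - p) . n
    A = np.hstack([np.cross(src_pts, dst_normals), dst_normals])   # (N, 6)
    b = np.einsum("ij,ij->i", dst_normals, dst_pts - src_pts)      # (N,)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    rx, ry, rz = x[:3]
    t = x[3:]
    # Recompose the small-angle rotation I + [w]_x
    R = np.array([[1.0,  -rz,   ry],
                  [ rz,  1.0,  -rx],
                  [-ry,   rx,  1.0]])
    return R, t
```

In a full tracker this step is iterated, composed onto the current pose estimate, and run from coarse to fine image resolutions; tracking against the fused model rather than the previous frame is what limits drift.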