July 27-August 1, 2024 | BINBIN HUANG, ZEHAO YU, ANPEI CHEN, ANDREAS GEIGER, SHENGHUA GAO
This paper introduces 2D Gaussian Splatting (2DGS), a novel method for geometrically accurate radiance field reconstruction from multi-view images. Unlike 3D Gaussian Splatting (3DGS), which represents a scene with volumetric 3D Gaussians, 2DGS represents it with oriented planar 2D Gaussian disks. The key idea is to collapse the 3D volume into a set of such disks that align tightly with the surfaces of the scene, enabling view-consistent geometry modeling and rendering as well as accurate surface reconstruction.
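To make this primitive concrete, each disk can be parameterized by a center, two tangential axes, and per-axis scales, so that a local coordinate (u, v) maps to a world-space point with a standard 2D Gaussian weight. The following is a minimal NumPy sketch of that parameterization, not the paper's implementation; the function and variable names are illustrative.

```python
import numpy as np

def splat_point_and_weight(p_k, t_u, t_v, s_u, s_v, u, v):
    """Map local splat coordinates (u, v) to a world-space point and
    evaluate the disk's 2D Gaussian weight (illustrative names)."""
    # Point on the oriented planar disk: P(u, v) = p_k + s_u*u*t_u + s_v*v*t_v
    point = p_k + s_u * u * t_u + s_v * v * t_v
    # Standard 2D Gaussian evaluated in the disk's local (u, v) frame
    weight = np.exp(-0.5 * (u ** 2 + v ** 2))
    return point, weight

# Example: a disk centered at the origin, lying in the xy-plane
p_k = np.zeros(3)
t_u = np.array([1.0, 0.0, 0.0])   # first tangential direction
t_v = np.array([0.0, 1.0, 0.0])   # second tangential direction
point, weight = splat_point_and_weight(p_k, t_u, t_v, 0.5, 0.5, 1.0, -1.0)
print(point, weight)
```

Because the Gaussian lives on a plane rather than in a volume, the same disk presents a consistent geometric footprint from every viewpoint, which is what makes the representation well suited to surface reconstruction.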
The method renders splats with a perspective-accurate 2D splatting process based on explicit ray-splat intersection and rasterization, ensuring an accurate geometry representation. Two regularizers, a depth distortion term and a normal consistency term, are added to further improve reconstruction quality. The resulting differentiable renderer produces noise-free, detailed geometry while maintaining competitive appearance quality, fast training, and real-time rendering.
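To illustrate the two regularizers, here is a minimal per-ray PyTorch sketch of the depth distortion and normal consistency terms as described in the paper; the function name and tensor shapes are illustrative, and the paper's implementation applies these inside its differentiable rasterizer during alpha blending.

```python
import torch

def regularization_terms(weights, depths, splat_normals, depth_normal):
    """Per-ray sketch of the two regularizers for N alpha-blended
    ray-splat intersections (tensor shapes are illustrative).
    weights:       (N,)   alpha-blending weights along the ray
    depths:        (N,)   intersection depths
    splat_normals: (N, 3) unit normals of the intersected splats
    depth_normal:  (3,)   unit normal estimated from the rendered depth map
    """
    # Depth distortion: concentrate the blended intersections at one depth,
    # L_d = sum_{i,j} w_i * w_j * |z_i - z_j|
    pairwise = torch.abs(depths[:, None] - depths[None, :])
    l_distortion = (weights[:, None] * weights[None, :] * pairwise).sum()

    # Normal consistency: align each splat's normal with the depth normal,
    # L_n = sum_i w_i * (1 - n_i . N)
    l_normal = (weights * (1.0 - splat_normals @ depth_normal)).sum()
    return l_distortion, l_normal
```

Intuitively, the depth distortion term pulls the intersections that contribute to a pixel toward a single depth, while the normal consistency term encourages each splat to lie tangent to the surface implied by the rendered depth map.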
The paper compares 2DGS with 3DGS and other baselines, demonstrating that 2DGS achieves state-of-the-art geometry reconstruction and novel view synthesis results while being significantly faster than previous implicit neural surface representations. Evaluated on several datasets, including DTU and Tanks and Temples, 2DGS outperforms competing methods in Chamfer distance, PSNR, and reconstruction speed, and produces detailed, noise-free surface reconstructions with sharp edges and intricate structures. The paper concludes that 2DGS is an effective and efficient approach for geometrically accurate radiance field reconstruction.