17 Jul 2024 | Matias Turkulainen, Xuqian Ren, Iaroslav Melekhov, Otto Seiskari, Esa Rahtu, Juho Kannala
DN-Splatter introduces depth and normal priors to enhance 3D Gaussian splatting for indoor scene reconstruction, improving both photorealism and surface quality by incorporating these cues during optimization. It regularizes Gaussian positions with edge-aware depth constraints and estimates per-Gaussian normals so that splats align with real surface boundaries. This enables efficient mesh extraction directly from the Gaussian representation, yielding more accurate and physically plausible reconstructions. The method uses monocular networks for depth and normal estimation, and proposes an adaptive depth loss, modulated by color image gradients, that improves both depth estimation and novel view synthesis. Experiments on indoor datasets show that DN-Splatter outperforms baselines in mesh reconstruction and novel view synthesis, including in challenging scenarios. The method is implemented in PyTorch and gsplat, and the code is available on GitHub.
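As a rough illustration of the adaptive depth loss, the sketch below down-weights the depth penalty where the color image has strong gradients, reflecting that depth discontinuities tend to coincide with color edges, so strict depth supervision there would blur object boundaries. The exp(-|∇I|) weight and log-L1 penalty are assumptions for illustration, not taken verbatim from the paper, and the function and argument names are hypothetical.

```python
import torch
import torch.nn.functional as F

def gradient_aware_depth_loss(pred_depth, gt_depth, rgb):
    """Depth loss down-weighted at color edges.

    pred_depth, gt_depth: (H, W) tensors; rgb: (3, H, W) tensor in [0, 1].
    A sketch of a gradient-aware depth loss; the exact weighting and
    penalty used by DN-Splatter may differ.
    """
    gray = rgb.mean(dim=0)  # (H, W) luminance proxy
    # Finite-difference image gradients, padded back to (H, W).
    dx = F.pad((gray[:, 1:] - gray[:, :-1]).abs(), (0, 1))
    dy = F.pad((gray[1:, :] - gray[:-1, :]).abs(), (0, 0, 0, 1))
    grad = dx + dy
    weight = torch.exp(-grad)  # near 1 in smooth regions, small at edges
    err = torch.log(1.0 + (pred_depth - gt_depth).abs())
    return (weight * err).mean()
```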
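The per-Gaussian normal estimate can be derived from each Gaussian's own geometry. A minimal sketch under the assumption that the normal is taken as the shortest principal axis of each Gaussian, oriented toward the camera; the helper name and argument layout are hypothetical, and the actual implementation lives in the released gsplat-based codebase.

```python
import torch

def gaussian_normals(quats, scales, cam_dirs):
    """Per-Gaussian normals as the axis of smallest scale.

    quats: (N, 4) unit quaternions (w, x, y, z); scales: (N, 3);
    cam_dirs: (N, 3) directions from Gaussian centers to the camera.
    """
    w, x, y, z = quats.unbind(-1)
    # Rotation matrices from quaternions; columns are the Gaussian axes.
    R = torch.stack([
        torch.stack([1 - 2*(y*y + z*z), 2*(x*y - w*z), 2*(x*z + w*y)], -1),
        torch.stack([2*(x*y + w*z), 1 - 2*(x*x + z*z), 2*(y*z - w*x)], -1),
        torch.stack([2*(x*z - w*y), 2*(y*z + w*x), 1 - 2*(x*x + y*y)], -1),
    ], dim=-2)  # (N, 3, 3)
    idx = scales.argmin(dim=-1)  # index of the shortest axis per Gaussian
    normals = R[torch.arange(len(idx)), :, idx]  # (N, 3)
    # Flip normals to face the camera so they agree with visible surfaces.
    flip = (normals * cam_dirs).sum(-1, keepdim=True) < 0
    return torch.where(flip, -normals, normals)
```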
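With rendered depth and normal maps in hand, a mesh can be extracted by back-projecting the depths into an oriented point cloud and running a surface reconstruction pass. A sketch using Open3D's Poisson reconstruction; the choice of Poisson as the backend is an assumption here, and the inputs are hypothetical aggregates of the render outputs.

```python
import open3d as o3d

def extract_mesh(points, normals, poisson_depth=9):
    """Poisson mesh from back-projected render outputs.

    points, normals: (M, 3) numpy arrays gathered from rendered depth
    and normal maps across training views (hypothetical inputs).
    """
    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(points)
    pcd.normals = o3d.utility.Vector3dVector(normals)
    mesh, _ = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
        pcd, depth=poisson_depth)
    return mesh
```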