3D Gaussian Splatting as Markov Chain Monte Carlo

17 Jun 2024 | Shakiba Kheradmand, Daniel Rebain, Gopal Sharma, Weiwei Sun, Yang-Che Tseng, Hossam Isack, Abhishek Kar, Andrea Tagliasacchi, Kwang Moo Yi
This paper proposes a novel approach to 3D Gaussian Splatting (3DGS) that interprets the placement and optimization of Gaussians as a Markov Chain Monte Carlo (MCMC) sampling process. Instead of relying on heuristic strategies for cloning, splitting, and pruning Gaussians, the authors reformulate these operations as deterministic state transitions within an MCMC framework. They introduce a stochastic gradient Langevin dynamics (SGLD) update rule that enables efficient exploration of the space of Gaussians while preserving the fidelity of the rendered scene.

The key idea is to treat the set of Gaussians as samples drawn from an underlying probability distribution that reflects the scene's structure. This view removes the need for careful initialization and hand-tuned heuristics, yielding improved rendering quality, direct control over the number of Gaussians, and increased robustness to initialization. The authors also add an L1 regularizer that encourages unused Gaussians to be removed, promoting more efficient use of computational resources.

Experiments on standard datasets show that the method outperforms conventional 3DGS in rendering quality and efficiency. The approach is implemented on top of the 3DGS framework in PyTorch and evaluated with standard metrics such as PSNR, SSIM, and LPIPS. It proves effective in both synthetic and real-world scenarios and provides a more principled, robust alternative to existing heuristic-based methods.
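To make the SGLD-style update and the opacity regularizer concrete, the sketch below shows one training step in PyTorch. It is a minimal illustration, not the authors' implementation: the function name `sgld_step`, the hyperparameter values, and the opacity-gated isotropic noise are assumptions made here; the paper additionally scales the noise by each Gaussian's covariance and uses its own schedule.

```python
import torch

def sgld_step(means, raw_opacities, render_loss, optimizer,
              lr_xyz=1.6e-4, noise_lr=5e5, opacity_reg=0.01):
    """One SGLD-style training step (illustrative sketch, not the paper's code).

    means:          (N, 3) leaf Parameter holding the Gaussian centers.
    raw_opacities:  (N,) tensor of pre-sigmoid opacities.
    render_loss:    scalar rendering loss for the current view.
    Hyperparameter values are placeholders, not the paper's settings.
    """
    # L1 regularizer on opacity: pushes unused Gaussians toward zero
    # opacity so they can be recycled or dropped.
    total_loss = render_loss + opacity_reg * torch.sigmoid(raw_opacities).mean()

    # Stochastic-gradient part of SGLD: an ordinary optimizer step.
    optimizer.zero_grad()
    total_loss.backward()
    optimizer.step()

    # Langevin noise on the centers. Low-opacity (under-used) Gaussians
    # receive large noise and keep exploring the scene, while confident
    # Gaussians are left almost untouched. The covariance-aware scaling
    # used in the paper is omitted here for brevity.
    with torch.no_grad():
        gate = torch.sigmoid(-100.0 * (torch.sigmoid(raw_opacities) - 0.005))
        noise = torch.randn_like(means) * gate.unsqueeze(-1)
        means.add_(noise_lr * lr_xyz * noise)

    return total_loss.detach()
```

In a full training loop, a step like this would run once per rendered view, with the relocation of low-opacity Gaussians handled separately as the deterministic state transitions described above.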