DOGS: Distributed-Oriented Gaussian Splatting for Large-Scale 3D Reconstruction Via Gaussian Consensus

29 Oct 2024 | Yu Chen, Gim Hee Lee
This paper proposes DOGS, a distributed training method for 3D Gaussian Splatting (3DGS) that accelerates large-scale 3D reconstruction. DOGS splits a scene into multiple blocks and trains them in parallel: a master node maintains a global 3DGS model while slave nodes train local models on their assigned blocks. A consensus step enforces consistency between the local and global 3DGS models, improving training efficiency and stability; after training, the local models are discarded and only the global model is used for inference. Evaluated on large-scale datasets, DOGS reduces training time by more than 6x while achieving state-of-the-art rendering quality, outperforming existing methods and mitigating the high GPU memory usage and long training times of standard 3DGS. The code is publicly available.
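The master/slave consensus loop described above follows the general shape of ADMM-style consensus optimization. The sketch below is a hypothetical, simplified illustration of that pattern (not the paper's actual 3DGS implementation): each worker minimizes a toy quadratic local loss plus a penalty tying its copy to the shared global model, the master averages the worker copies, and dual variables accumulate each worker's disagreement. The function name `consensus_train`, the quadratic losses, and the parameter `rho` are all assumptions for illustration.

```python
import numpy as np

def consensus_train(worker_targets, rho=1.0, rounds=100):
    """ADMM-style consensus averaging: a toy stand-in for distributed
    training where each worker i has a local objective
    f_i(x) = ||x - t_i||^2 and all workers must agree on a global z."""
    worker_targets = [np.asarray(t, dtype=float) for t in worker_targets]
    z = np.zeros_like(worker_targets[0])             # global (master) model
    duals = [np.zeros_like(z) for _ in worker_targets]
    for _ in range(rounds):
        # Local step (slave nodes): closed-form minimizer of
        # f_i(x) + (rho/2) * ||x - z + u_i||^2
        xs = [(2.0 * t + rho * (z - u)) / (2.0 + rho)
              for t, u in zip(worker_targets, duals)]
        # Master step: average worker copies plus duals into the global model
        z = np.mean([x + u for x, u in zip(xs, duals)], axis=0)
        # Dual step: accumulate each worker's residual disagreement
        duals = [u + x - z for x, u in zip(xs, duals)]
    return z

# With quadratic local losses, the consensus solution is the mean of the
# worker targets, so two workers at 1.0 and 3.0 should agree near 2.0.
z = consensus_train([[1.0], [3.0]])
```

In the real method the local step is gradient-based 3DGS training on a scene block rather than a closed-form quadratic solve, but the alternation of local updates, global averaging, and dual accumulation is the same structural idea.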