This paper proposes DOGS, a distributed training method for 3D Gaussian Splatting (3DGS) that accelerates large-scale 3D reconstruction. DOGS splits a scene into multiple blocks and trains them in parallel, maintaining a global 3DGS model on a master node and local models on slave nodes. The local models are discarded after training; only the global model is used for inference. A consensus mechanism keeps the local and global 3DGS models consistent, improving training efficiency and stability, and addressing the high GPU memory usage and long training times of standard 3DGS. Evaluated on large-scale datasets, DOGS reduces training time by more than 6x while achieving state-of-the-art rendering quality, outperforming existing methods. The code is publicly available.
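To make the master/slave consensus idea concrete, here is a minimal sketch of a consensus averaging step, assuming each worker holds a drifted local copy of shared parameters and the master periodically averages them into the global model. The function name `consensus_update` and the toy parameter vectors are illustrative assumptions, not the paper's actual implementation (which operates on block-wise 3DGS models).

```python
import numpy as np

def consensus_update(local_params, weights=None):
    """Average per-worker parameter arrays into a global consensus.

    In a DOGS-like distributed trainer, the master would run a step
    like this periodically, then broadcast the result back to workers.
    """
    stacked = np.stack(local_params)  # shape: (n_workers, n_params)
    if weights is None:
        # Equal weighting across workers by default.
        weights = np.full(len(local_params), 1.0 / len(local_params))
    return np.average(stacked, axis=0, weights=weights)

# Toy example: three workers hold drifted copies of a 4-parameter model.
locals_ = [np.array([1.0, 2.0, 3.0, 4.0]),
           np.array([1.2, 1.8, 3.1, 3.9]),
           np.array([0.8, 2.2, 2.9, 4.1])]
global_params = consensus_update(locals_)
print(global_params)  # element-wise mean: [1. 2. 3. 4.]
```

In practice, consensus methods such as ADMM also add a penalty term pulling each local model toward the global one during training, rather than only averaging afterward; this sketch shows just the aggregation step.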