SA-GS: Scale-Adaptive Gaussian Splatting for Training-Free Anti-Aliasing

28 Mar 2024 | Xiaowei Song*1,2, Jv Zheng*1,3, Shiran Yuan1,4, Huan-ang Gao1, Jingwei Zhao5, Xiang He5, Weihao Gu5, and Hao Zhao†1
The paper introduces a training-free method called Scale-Adaptive Gaussian Splatting (SA-GS) to improve the anti-aliasing performance of 3D Gaussian Splatting (3DGS). 3DGS, a neural rendering paradigm, uses Gaussian primitives to represent 3D scenes, but it suffers from scale mismatch and aliasing when rendered at resolutions or distances different from those used in training. The core contribution of SA-GS is a 2D scale-adaptive filter that keeps Gaussian projection scales consistent across different rendering settings. Because this filter keeps the projected Gaussian distribution consistent with the training setup, conventional anti-aliasing techniques such as super-sampling and integration work effectively on top of it: super-sampling increases the sampling rate within each pixel to reduce aliasing, while integration computes the integral of the projected Gaussian over each pixel's area.
Extensive experiments on the Mip-NeRF 360 and Blender datasets show that SA-GS performs comparably to or better than state-of-the-art methods, and in zoom-out scenarios it outperforms Mip-Splatting. The method is flexible, elegant, and accurate, making it a significant advancement in 3D scene rendering.
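The two ingredients above can be illustrated with a minimal sketch. This is not the paper's exact formulation: the focal-ratio rescaling in `scale_adaptive_sigma` and the isotropic per-pixel integration in `integrated_weight` are simplified assumptions, chosen to show the idea of keeping a projected Gaussian's screen-space scale consistent and then integrating it over a pixel instead of point-sampling it.

```python
import math

def erf_interval(mu, sigma, lo, hi):
    """Mass of a normalized 1-D Gaussian N(mu, sigma^2) over [lo, hi]."""
    s = sigma * math.sqrt(2.0)
    return 0.5 * (math.erf((hi - mu) / s) - math.erf((lo - mu) / s))

def scale_adaptive_sigma(sigma_train_px, focal_train, focal_render):
    """Hypothetical scale-adaptive filter: rescale the trained 2-D screen-space
    sigma (in pixels) by the focal-length ratio so the projected extent stays
    consistent with the training setup when zooming in or out."""
    return sigma_train_px * (focal_render / focal_train)

def integrated_weight(mu_x, mu_y, sigma, px, py):
    """Integration-based anti-aliasing for one pixel: integrate an isotropic
    2-D Gaussian over the unit pixel [px, px+1) x [py, py+1) rather than
    sampling it at the pixel center. The 2-D integral factorizes per axis."""
    return (erf_interval(mu_x, sigma, px, px + 1.0) *
            erf_interval(mu_y, sigma, py, py + 1.0))
```

Summing `integrated_weight` over all pixels recovers the Gaussian's total mass, which is what makes the integrated splat alias-free at coarse resolutions where a center sample would miss or overweight the Gaussian.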