13 Aug 2024 | Dingxi Zhang, Yu-Jie Yuan, Zhuoxun Chen, Fang-Lue Zhang, Zhenliang He, Shiguang Shan, and Lin Gao
StylizedGS is a controllable 3D neural style transfer framework based on 3D Gaussian Splatting (3DGS) that enables efficient and flexible stylization of 3D scenes. The method addresses limitations of existing NeRF-based approaches by introducing a filter-based refinement to eliminate floaters and a nearest-neighbor-based style loss to fine-tune the geometry and color parameters of the 3DGS representation. It also incorporates a depth preservation loss to maintain geometric consistency, and it allows users to control the color, scale, and spatial regions affected during stylization.

The framework achieves high-quality stylization with faithful brushstrokes and geometric consistency, and it lets users customize results through perceptual controls, offering flexible and interactive control over style features. Compared to existing 3D stylization methods, it is efficient, with fast inference and reduced training times, and it supports real-time rendering through a user-friendly interface for interactive scene stylization.

The approach is validated through qualitative and quantitative comparisons with state-of-the-art methods across various scenes and styles, showing superior performance in style matching, content preservation, and efficiency. Ablation studies confirm the design choices, demonstrating the importance of the depth preservation loss, density optimization, and the 3DGS filter for high-quality stylization. Overall, StylizedGS offers a novel and effective solution for 3D scene stylization with controllable and customizable style transfer.
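The two losses described above can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: it assumes VGG-style feature maps already flattened to `(N, C)` arrays, and the function names and the plain L1 depth term are illustrative choices.

```python
import numpy as np

def nnfm_style_loss(feat_render, feat_style):
    """Nearest-neighbor style loss sketch.

    feat_render: (N, C) features of the rendered stylized view.
    feat_style:  (M, C) features of the style image.
    Each rendered feature is matched to its cosine-nearest style
    feature, and the mean cosine distance to that match is penalized.
    """
    # Normalize rows so dot products become cosine similarities.
    r = feat_render / (np.linalg.norm(feat_render, axis=1, keepdims=True) + 1e-8)
    s = feat_style / (np.linalg.norm(feat_style, axis=1, keepdims=True) + 1e-8)
    sim = r @ s.T                 # (N, M) pairwise cosine similarities
    nn_sim = sim.max(axis=1)      # best style match per rendered feature
    return float(np.mean(1.0 - nn_sim))

def depth_preservation_loss(depth_stylized, depth_content):
    """L1 penalty keeping the stylized scene's rendered depth close to
    the depth of the original photorealistic reconstruction, which
    discourages geometry drift (e.g. floaters) during stylization."""
    return float(np.mean(np.abs(depth_stylized - depth_content)))
```

During stylization, a weighted sum of these terms (plus a content loss) would be backpropagated to the Gaussians' color and geometry parameters; the weights trade off style faithfulness against geometric consistency.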