StylizedGS: Controllable Stylization for 3D Gaussian Splatting

TPAMI 2025

1 University of Chinese Academy of Sciences
2 Beijing Key Laboratory of Mobile Computing and Pervasive Device, Institute of Computing Technology, Chinese Academy of Sciences
3 Victoria University of Wellington
4 Key Laboratory of Intelligent Information Processing, Institute of Computing Technology, Chinese Academy of Sciences
* Corresponding Author

Given a 2D style image, the proposed StylizedGS method stylizes a pre-trained 3D Gaussian Splatting scene to match the desired style, with detailed geometric features and satisfactory visual quality, within a few minutes. We also enable users to control several perceptual factors during stylization, such as color, style pattern size (scale), and the stylized regions (spatial), to enhance customization capabilities.


Video

Abstract

As XR technology continues to advance rapidly, 3D generation and editing are increasingly crucial. Among these, stylization plays a key role in enhancing the appearance of 3D models. By utilizing stylization, users can achieve consistent artistic effects in 3D editing from a single reference style image, making it a user-friendly editing method. However, recent NeRF-based 3D stylization methods suffer from efficiency issues that degrade the user experience, and their implicit nature limits their ability to accurately transfer geometric pattern styles. Additionally, flexible control over the stylized results is highly desirable for artists, fostering an environment conducive to creative exploration. To address these issues, we introduce StylizedGS, an efficient 3D neural style transfer framework with adaptable control over perceptual factors, built on the 3D Gaussian Splatting (3DGS) representation. We propose a filter-based refinement to eliminate floaters that would otherwise degrade the stylization effects during scene reconstruction. A nearest-neighbor-based style loss is introduced to achieve stylization by fine-tuning the geometry and color parameters of the 3DGS, while a depth preservation loss together with other regularizations prevents undesired alterations to the geometric content. Moreover, facilitated by specially designed losses, StylizedGS enables users to control color, stylization scale, and stylized regions, providing customization capabilities. Our method achieves high-quality stylization results characterized by faithful brushstrokes and geometric consistency with flexible controls. Extensive experiments across various scenes and styles demonstrate the effectiveness and efficiency of our method in terms of both stylization quality and inference speed.
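To make the nearest-neighbor style loss mentioned above concrete, here is a minimal sketch of its usual form (matching each content feature to its closest style feature under cosine distance and averaging those distances). The function name and the use of plain numpy arrays are illustrative assumptions; in practice the features would be deep VGG features and the loss would be backpropagated through the differentiable 3DGS renderer.

```python
import numpy as np

def nn_style_loss(content_feats, style_feats, eps=1e-8):
    """Nearest-neighbor feature-matching style loss (illustrative sketch).

    content_feats: (Nc, D) features extracted from a rendered view.
    style_feats:   (Ns, D) features extracted from the style image.
    For each content feature, find the most similar style feature under
    cosine similarity and average the resulting cosine distances.
    """
    # Normalize rows to unit length so a dot product equals cosine similarity.
    c = content_feats / (np.linalg.norm(content_feats, axis=1, keepdims=True) + eps)
    s = style_feats / (np.linalg.norm(style_feats, axis=1, keepdims=True) + eps)
    cos_sim = c @ s.T                # (Nc, Ns) pairwise cosine similarities
    nn_sim = cos_sim.max(axis=1)     # best-matching style feature per content feature
    return float(np.mean(1.0 - nn_sim))
```

Unlike a Gram-matrix loss, this per-feature matching preserves sharp local style structures (brushstrokes), which is why losses of this family are favored for detailed 3D stylization.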


Pipeline


We first reconstruct a photo-realistic 3DGS $G_{\theta}^{rec}$ from multi-view inputs. Following this, color matching with the style image is performed, accompanied by the filter-based refinement to preemptively address potential artifacts. During optimization, we employ multiple loss terms to capture detailed local style structures and preserve geometric attributes. Users can flexibly control color, scale, and spatial attributes during stylization through customizable loss terms. Once stylization is complete, we obtain consistent free-viewpoint stylized renderings.
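The color-matching step described above can be sketched with a simple channel-wise mean/standard-deviation transfer (Reinhard-style). This is a stand-in under stated assumptions: the paper's actual color matching may use a different transform, and `match_color` is a hypothetical helper name.

```python
import numpy as np

def match_color(content, style, eps=1e-8):
    """Shift content image statistics toward the style image, per RGB channel.

    content, style: float arrays of shape (H, W, 3) with values in [0, 1].
    Illustrative sketch only; not the paper's exact color-matching transform.
    """
    c = content.reshape(-1, 3)
    s = style.reshape(-1, 3)
    c_mu, c_std = c.mean(axis=0), c.std(axis=0) + eps
    s_mu, s_std = s.mean(axis=0), s.std(axis=0) + eps
    # Whiten content channels, then re-color with the style statistics.
    out = (c - c_mu) / c_std * s_std + s_mu
    return np.clip(out, 0.0, 1.0).reshape(content.shape)
```

Matching colors before optimization lets the subsequent style loss focus on transferring pattern and brushstroke structure rather than spending iterations correcting the global palette.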

Results Gallery

Color Control

Scale Control

Spatial Control

Sequential Control

BibTeX

@article{zhang2024stylizedgscontrollablestylization3d,
      title={StylizedGS: Controllable Stylization for 3D Gaussian Splatting}, 
      author={Dingxi Zhang and Yu-Jie Yuan and Zhuoxun Chen and Fang-Lue Zhang and Zhenliang He and Shiguang Shan and Lin Gao},
      year={2024},
      eprint={2404.05220},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2404.05220}, 
}