TextureDreamer: Image-guided Texture Synthesis through Geometry-aware Diffusion

17 Jan 2024 | Yu-Ying Yeh, Jia-Bin Huang, Changil Kim, Lei Xiao, Thu Nguyen-Phuoc, Numair Khan, Cheng Zhang, Manmohan Chandraker, Carl S Marshall, Zhao Dong, Zhengqin Li
TextureDreamer is an image-guided texture synthesis method that transfers relightable textures from a small number of input images (3 to 5) to target 3D shapes across arbitrary categories. It addresses the challenge of creating high-quality, detailed textures for arbitrary objects from only a few casually captured images.

The core idea, personalized geometry-aware score distillation (PGSD), draws on recent advances in diffusion-based generative models. Because complex real-world textures are difficult to describe accurately in text, PGSD instead extracts texture information from the input images by fine-tuning a pre-trained diffusion model with a unique text token, together with several essential modifications that improve texture quality.

To generate photorealistic and diverse textures, the method combines variational score distillation (VSD) with explicit geometry guidance from ControlNet. VSD treats the entire 3D representation as a random variable and aligns its distribution with the pre-trained diffusion model. ControlNet injects geometry information extracted from the given mesh into the diffusion model, ensuring that the generated textures align with the target geometry; this significantly improves the 3D consistency of the results. The two sketches below illustrate these stages.
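The personalization stage binds the appearance of the input photos to a rare text token via DreamBooth-style fine-tuning. Below is a minimal sketch of that stage, assuming the Hugging Face `diffusers` Stable Diffusion API; the token "sks", the learning rate, the step count, and the `loader` that yields the 3-5 input views are illustrative assumptions, not values from the paper.

```python
# Minimal sketch: bind the input images to a unique text token by
# fine-tuning a pre-trained diffusion model (DreamBooth-style).
from itertools import cycle

import torch
import torch.nn.functional as F
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float32
)
unet, vae, text_encoder, tokenizer, scheduler = (
    pipe.unet, pipe.vae, pipe.text_encoder, pipe.tokenizer, pipe.scheduler
)
vae.requires_grad_(False)
text_encoder.requires_grad_(False)
unet.train()
optimizer = torch.optim.AdamW(unet.parameters(), lr=2e-6)

# "sks" is the unique token that will come to denote the captured texture.
prompt_ids = tokenizer("a photo of sks object", return_tensors="pt").input_ids
text_emb = text_encoder(prompt_ids)[0]

images = cycle(loader)  # assumption: yields the 3-5 input views, (B,3,512,512) in [-1,1]
for step in range(400):
    image = next(images)
    latents = vae.encode(image).latent_dist.sample() * vae.config.scaling_factor
    noise = torch.randn_like(latents)
    t = torch.randint(0, scheduler.config.num_train_timesteps, (latents.shape[0],))
    noisy = scheduler.add_noise(latents, noise, t)
    pred = unet(noisy, t, encoder_hidden_states=text_emb).sample
    loss = F.mse_loss(pred, noise)  # standard epsilon-prediction objective
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```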
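A single PGSD texture-update step can then be sketched as follows. This is not the paper's implementation: `scheduler` and the personalized `unet` come from the previous sketch, while `render_latents` (a differentiable rendering of the textured mesh, encoded into latent space), `normal_map` (rasterized from the target mesh), and `lora_unet` (the auxiliary network VSD trains to model the current rendering distribution; its own training step and the timestep weighting are omitted) are hypothetical stand-ins. The normal-conditioned ControlNet checkpoint is one plausible choice of geometry guidance.

```python
# Sketch of one PGSD step: VSD gradient with ControlNet geometry guidance.
import torch
from diffusers import ControlNetModel

controlnet = ControlNetModel.from_pretrained("lllyasviel/sd-controlnet-normal")

def pgsd_step(render_latents, normal_map, text_emb, texture_optimizer):
    noise = torch.randn_like(render_latents)
    t = torch.randint(0, scheduler.config.num_train_timesteps,
                      (render_latents.shape[0],))
    noisy = scheduler.add_noise(render_latents, noise, t)

    with torch.no_grad():
        # Geometry guidance: ControlNet residuals inject mesh normals
        # into the personalized UNet's denoising pass.
        down, mid = controlnet(
            noisy, t, encoder_hidden_states=text_emb,
            controlnet_cond=normal_map, return_dict=False,
        )
        eps_prior = unet(
            noisy, t, encoder_hidden_states=text_emb,
            down_block_additional_residuals=down,
            mid_block_additional_residual=mid,
        ).sample
        eps_current = lora_unet(noisy, t, encoder_hidden_states=text_emb).sample

    # VSD-style gradient: the difference of the two scores pulls the
    # rendering distribution toward the personalized, geometry-aware prior.
    grad = eps_prior - eps_current
    loss = (grad * render_latents).sum()  # d(loss)/d(render_latents) == grad
    loss.backward()
    texture_optimizer.step()
    texture_optimizer.zero_grad()
```

Because `loss.backward()` propagates `grad` through the differentiable renderer, the texture parameters are updated directly, which is what lets a 2D diffusion prior drive a relightable 3D texture.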
Experiments on real images spanning different categories show that TextureDreamer successfully transfers highly realistic, semantically meaningful textures to arbitrary objects, surpassing the visual quality of previous state-of-the-art methods. The evaluation covers a variety of object categories, including sofas, beds, mugs, and plush toys, and demonstrates transfer across categories, producing diverse, high-quality textures that are visually appealing and semantically meaningful. A user study further confirms the method's effectiveness: TextureDreamer outperforms existing methods in texture quality, photorealism, and texture-geometry compatibility.