Universal Style Transfer via Feature Transforms


17 Nov 2017 | Yijun Li, Chen Fang, Jimei Yang, Zhaowen Wang, Xin Lu, Ming-Hsuan Yang
This paper proposes a universal style transfer method that transfers arbitrary visual styles to content images without pre-defined styles or per-style training. Content and style images are encoded with a pre-trained VGG network, and the statistics of the content features are matched to those of the style features through two feature transforms: whitening, which removes the correlations of the content features, and coloring, which imposes the covariance structure of the style features. These transforms are embedded in an image reconstruction network whose decoder is trained only to invert VGG features back to images, so stylization reduces to simple feed-forward operations: encode, apply the whitening and coloring transform (WCT), and decode the transformed features into the stylized image.
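As a concrete illustration, below is a minimal NumPy sketch of a whitening and coloring transform on flattened feature maps. The function name, the epsilon regularization, and the eigenvalue clipping are simplifications introduced here for illustration; the paper's implementation operates on VGG relu_X_1 feature maps and handles near-zero eigenvalues differently.

```python
import numpy as np

def whiten_color_transform(fc, fs, eps=1e-5):
    """Match second-order statistics of content features to style features.

    fc: content features, shape (C, Hc*Wc)
    fs: style features,   shape (C, Hs*Ws)
    Returns transformed features of shape (C, Hc*Wc).
    """
    C = fc.shape[0]

    # Center both feature maps.
    mc = fc.mean(axis=1, keepdims=True)
    ms = fs.mean(axis=1, keepdims=True)
    fc_c = fc - mc
    fs_c = fs - ms

    # Whitening: decorrelate the content features using the
    # eigen-decomposition of their covariance matrix.
    cov_c = fc_c @ fc_c.T / (fc_c.shape[1] - 1) + eps * np.eye(C)
    Dc, Ec = np.linalg.eigh(cov_c)
    Dc = np.clip(Dc, eps, None)
    fc_white = Ec @ np.diag(Dc ** -0.5) @ Ec.T @ fc_c

    # Coloring: re-correlate the whitened features with the style covariance.
    cov_s = fs_c @ fs_c.T / (fs_c.shape[1] - 1) + eps * np.eye(C)
    Ds, Es = np.linalg.eigh(cov_s)
    Ds = np.clip(Ds, eps, None)
    fcs = Es @ np.diag(Ds ** 0.5) @ Es.T @ fc_white

    # Re-center with the style mean.
    return fcs + ms
```

The whitened content features retain the spatial structure of the content image but carry an identity covariance, so the coloring step can impose the style statistics without destroying content layout.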
A multi-level stylization pipeline applies the transform at several VGG layers from coarse to fine, which yields higher-quality stylized images at modest computational cost. The method also exposes user controls, including style scale, a blending weight between content and stylized features, and spatial masks for region-specific styles. Evaluated on style transfer and texture synthesis, the approach produces high-quality results that balance stylization against content preservation, and comparisons with existing style transfer methods show favorable quality, efficiency, and flexibility; for texture synthesis it generates diverse, visually pleasing textures. Extensive experiments and comparisons support the method's effectiveness and versatility across both tasks.
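The blending weight and the coarse-to-fine pipeline can be sketched as follows, reusing whiten_color_transform from the snippet above. Here `encoders` and `decoders` are hypothetical placeholders for the pre-trained VGG encoders (deepest layer first) and their mirrored reconstruction decoders, and features are assumed to be passed around already flattened to shape (C, H*W).

```python
def stylize_level(fc, fs, alpha=0.6):
    """Blend the WCT output with the original content features.
    alpha = 1.0 gives full stylization; alpha = 0.0 returns the content."""
    fcs = whiten_color_transform(fc, fs)
    return alpha * fcs + (1.0 - alpha) * fc

def multi_level_stylization(content_img, style_img, encoders, decoders, alpha=0.6):
    """Coarse-to-fine pipeline: apply the transform at the deepest layer first,
    decode back to image space, then repeat at progressively shallower layers.
    `encoders`/`decoders` are assumed callables wrapping the pre-trained VGG
    encoders and their reconstruction decoders (placeholders, not the paper's API).
    """
    x = content_img
    for encode, decode in zip(encoders, decoders):
        fc = encode(x)           # content features at this level
        fs = encode(style_img)   # style features at this level
        x = decode(stylize_level(fc, fs, alpha))
    return x
```

Processing deeper layers first transfers coarse style patterns, and subsequent shallower levels refine colors and fine textures, which is what the multi-level pipeline in the paper is designed to achieve.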