Regional Style and Color Transfer

26 Jun 2024 | Zhicheng Ding, Panfeng Li, Qikai Yang, Siyang Li, Qingtian Gong
This paper introduces a novel approach to regional style and color transfer, addressing the limitations of existing methods that apply style homogeneously across the entire image, leading to stylistic inconsistencies. The proposed method leverages a segmentation network to isolate foreground objects within the input image and applies style transfer exclusively to the background region. The isolated foreground objects are then reintegrated into the style-transferred background, with a color transfer step enhancing visual coherence. Feathering techniques achieve a seamless blend, resulting in a visually unified and aesthetically pleasing final composition. Extensive evaluations demonstrate that the proposed approach yields more natural stylistic transformations than conventional methods.

The key contributions of the paper include:

- A regional style and color transfer approach designed to overcome the unnatural effects of global style or color transfer.
- A demonstration of the advantages of applying style and color transfer within distinct image regions.
- An extension that establishes a correspondence between painted images and video frames, enabling the creation of stylized films.
- An effective framework for seamlessly applying and switching between style transfer algorithms using segmentation techniques.

The paper details the methods used, including semantic segmentation with DeepLabv3+, boundary optimization, artistic style transfer, and color transfer. The results section evaluates the impact of boundary optimization, background style transfer, foreground color transfer, and image blending. Comparative analyses demonstrate the method's ability to achieve visually harmonious and aesthetically pleasing outcomes.
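The compositing stage described above — matching the foreground's colors to the stylized background and then blending through a feathered mask — can be sketched as follows. This is a minimal illustration, assuming a Reinhard-style per-channel mean/std color match and a simple box-blur feather; it is not the paper's exact implementation, and the segmentation mask and stylized background are taken as given:

```python
import numpy as np

def transfer_color(fg, ref, eps=1e-6):
    """Match the per-channel mean and std of fg to ref
    (Reinhard-style statistics transfer, assumed here for illustration).
    fg, ref: float arrays of shape (H, W, 3) with values in [0, 1]."""
    out = np.empty_like(fg)
    for c in range(3):
        mu_f, sd_f = fg[..., c].mean(), fg[..., c].std()
        mu_r, sd_r = ref[..., c].mean(), ref[..., c].std()
        out[..., c] = (fg[..., c] - mu_f) * (sd_r / (sd_f + eps)) + mu_r
    return np.clip(out, 0.0, 1.0)

def feather_mask(mask, radius=3):
    """Soften a binary foreground mask with a separable box blur,
    a simple stand-in for the paper's feathering step."""
    soft = mask.astype(np.float64)
    k = 2 * radius + 1
    kernel = np.ones(k) / k
    for axis in (0, 1):
        soft = np.apply_along_axis(
            lambda m: np.convolve(m, kernel, mode="same"), axis, soft)
    return soft

def compose(content, stylized_bg, mask, radius=3):
    """Blend the original foreground (color-matched to the stylized
    background) over the stylized background with a feathered alpha mask."""
    fg = transfer_color(content, stylized_bg)
    alpha = feather_mask(mask, radius)[..., None]
    return alpha * fg + (1.0 - alpha) * stylized_bg
```

In practice the mask would come from a segmentation model such as DeepLabv3+ and the stylized background from a neural style transfer network; the sketch only shows how the pieces are reassembled.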
The conclusions highlight the method's potential for enhancing visual quality and realism, with future research directions focusing on expanding applicability and optimizing inference speed.