Regional Style and Color Transfer

26 Jun 2024 | Zhicheng Ding, Panfeng Li, Qikai Yang, Siyang Li, Qingtian Gong
This paper presents a novel approach to regional style and color transfer, addressing the limitations of global style transfer methods, which often produce unnatural results. The proposed method leverages object segmentation to isolate foreground objects within an image, applying style transfer to the background and color transfer to the foreground. The color-transferred foreground elements are then carefully reintegrated into the style-transferred background, and feathering techniques are used to achieve a seamless amalgamation of the two regions, resulting in a visually unified and aesthetically pleasing final composition. Extensive evaluations demonstrate that the proposed approach yields significantly more natural stylistic transformations than conventional methods.

The pipeline begins with semantic segmentation using DeepLabv3+, which generates a mask separating the foreground (e.g., a person) from the background (e.g., a scenic view, road, or room). Boundary optimization is then applied to refine the segmentation mask, ensuring precise boundaries for independent style and color application. Style transfer is applied to the background region, while color transfer is applied to the foreground region. The two altered regions are then seamlessly blended using alpha blending, with the optimized segmentation mask providing the continuous values between 0.0 and 1.0 required for this process.

For the color transfer step, the paper adopts the lαβ color space, which aligns well with natural scene statistics and human visual perception. Principal Component Analysis (PCA) is employed in this space to capture dominant color variations, enabling the transfer of color characteristics from the style image to the foreground.
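The mask-driven blending step can be illustrated with a short sketch. This is not the paper's implementation: the `feather` and `alpha_blend` helpers are illustrative assumptions, and the 3×3 box-blur feathering stands in for whatever boundary optimization the authors use. The idea is that a binary segmentation mask is softened into continuous values in [0.0, 1.0], which then weight the per-pixel mix of the color-transferred foreground and the style-transferred background.

```python
import numpy as np

def feather(mask: np.ndarray, iterations: int = 5) -> np.ndarray:
    """Soften a binary mask by repeated 3x3 box blurring (a simple feathering stand-in)."""
    m = mask.astype(np.float64)
    for _ in range(iterations):
        padded = np.pad(m, 1, mode="edge")
        # average each pixel's 3x3 neighbourhood
        m = sum(padded[i:i + m.shape[0], j:j + m.shape[1]]
                for i in range(3) for j in range(3)) / 9.0
    return np.clip(m, 0.0, 1.0)

def alpha_blend(foreground: np.ndarray, background: np.ndarray,
                mask: np.ndarray) -> np.ndarray:
    """Per-pixel blend: mask==1 keeps the foreground, mask==0 keeps the background."""
    a = mask[..., None]  # broadcast the 2-D mask over the RGB channels
    return a * foreground + (1.0 - a) * background
```

Near the mask boundary the feathered values fall strictly between 0 and 1, so the composite transitions gradually instead of showing a hard seam.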
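A minimal sketch of color transfer in the lαβ space helps make this concrete. Two hedges apply: the conversion matrices below are the standard Ruderman/Reinhard ones, and the statistics matching uses per-channel mean and standard deviation rather than the paper's PCA projection (a PCA variant would replace the per-channel std ratio with a rotation onto the dominant color axes). All function names are illustrative.

```python
import numpy as np

# RGB -> LMS cone-response matrix (Ruderman et al., as used in Reinhard-style transfer)
RGB2LMS = np.array([[0.3811, 0.5783, 0.0402],
                    [0.1967, 0.7244, 0.0782],
                    [0.0241, 0.1288, 0.8444]])
# log-LMS -> l-alpha-beta decorrelating transform
LMS2LAB = np.diag([1 / np.sqrt(3), 1 / np.sqrt(6), 1 / np.sqrt(2)]) @ \
          np.array([[1.0, 1.0, 1.0],
                    [1.0, 1.0, -2.0],
                    [1.0, -1.0, 0.0]])

def rgb_to_lab(rgb: np.ndarray) -> np.ndarray:
    lms = np.clip(rgb.reshape(-1, 3) @ RGB2LMS.T, 1e-6, None)  # avoid log(0)
    return (np.log10(lms) @ LMS2LAB.T).reshape(rgb.shape)

def lab_to_rgb(lab: np.ndarray) -> np.ndarray:
    lms = 10.0 ** (lab.reshape(-1, 3) @ np.linalg.inv(LMS2LAB).T)
    return (lms @ np.linalg.inv(RGB2LMS).T).reshape(lab.shape)

def transfer_color(source_rgb: np.ndarray, style_rgb: np.ndarray) -> np.ndarray:
    """Shift the source's per-channel statistics in l-alpha-beta toward the style image."""
    src, sty = rgb_to_lab(source_rgb), rgb_to_lab(style_rgb)
    mu_s, sd_s = src.mean(axis=(0, 1)), src.std(axis=(0, 1)) + 1e-8
    mu_t, sd_t = sty.mean(axis=(0, 1)), sty.std(axis=(0, 1))
    out = (src - mu_s) * (sd_t / sd_s) + mu_t
    return np.clip(lab_to_rgb(out), 0.0, 1.0)
```

Because the three lαβ channels are largely decorrelated for natural images, matching their statistics independently transfers the style image's overall palette without mixing luminance into the chromatic channels.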
The method is evaluated through extensive experiments, demonstrating its effectiveness in achieving a visually harmonious result: the artistic style of the reference image is seamlessly integrated into the background, while the foreground portraits retain their original color characteristics with high fidelity. The proposed approach offers a concise and effective framework for applying and switching between style transfer algorithms, and could be extended to establish a correspondence between painted images and video frames, enabling the creation of stylized films. The method is scalable and efficient, with minimal computational overhead; future research directions include extending its applicability beyond human segmentation to encompass diverse object categories and further optimizing inference speed and efficiency.