Guided Image Filtering


2010 | Kaiming He, Jian Sun, and Xiaoou Tang
This paper introduces a novel explicit image filter called the guided filter. Derived from a local linear model, the guided filter computes its output by considering the content of a guidance image, which can be the input image itself or another image. It preserves edges like the popular bilateral filter but behaves better near edges, and it is also related to the matting Laplacian matrix, making it a more general concept than a smoothing operator. The guided filter has a fast, non-approximate linear-time algorithm whose computational complexity is independent of the filtering kernel size.

The guided filter is defined as a linear transform of the guidance image: within each local window, the output is assumed to be a linear function of the guidance image. The linear coefficients are determined by linear regression, minimizing the difference between the filter output and the input image, and the final output at each pixel is obtained by averaging the values predicted by all local windows that cover it.

The guided filter has edge-preserving smoothing properties and avoids the gradient reversal artifacts that may appear in detail enhancement and HDR compression. Its connection to the matting Laplacian matrix provides new insights into the filter and inspires new applications. An exact O(N)-time algorithm makes it faster and more efficient than the bilateral filter, and it can also be applied with color guidance images. The guided filter is effective and efficient in a wide range of computer vision and graphics applications, including noise reduction, detail smoothing/enhancement, HDR compression, image matting/feathering, haze removal, and joint upsampling.
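The steps above (per-window linear regression, then averaging the coefficients) can be sketched in NumPy for a gray-scale guide. This is a minimal illustration, not the authors' code; the function names are my own, and `box_filter` uses the cumulative-sum trick so that the cost per pixel is independent of the window radius `r`:

```python
import numpy as np

def box_filter(img, r):
    """Sum over each (2r+1)x(2r+1) window via cumulative sums -- O(N)."""
    rows, cols = img.shape
    out = np.zeros_like(img)
    # Running sums down the rows, with truncated windows at the borders.
    cum = np.cumsum(img, axis=0)
    out[:r + 1] = cum[r:2 * r + 1]
    out[r + 1:rows - r] = cum[2 * r + 1:] - cum[:rows - 2 * r - 1]
    out[rows - r:] = cum[-1:] - cum[rows - 2 * r - 1:rows - r - 1]
    # Same pass across the columns.
    cum = np.cumsum(out, axis=1)
    res = np.zeros_like(out)
    res[:, :r + 1] = cum[:, r:2 * r + 1]
    res[:, r + 1:cols - r] = cum[:, 2 * r + 1:] - cum[:, :cols - 2 * r - 1]
    res[:, cols - r:] = cum[:, -1:] - cum[:, cols - 2 * r - 1:cols - r - 1]
    return res

def guided_filter(I, p, r, eps):
    """Filter input p guided by image I (both float arrays in [0, 1] or similar)."""
    N = box_filter(np.ones_like(I), r)          # per-pixel window size
    mean_I = box_filter(I, r) / N
    mean_p = box_filter(p, r) / N
    cov_Ip = box_filter(I * p, r) / N - mean_I * mean_p
    var_I = box_filter(I * I, r) / N - mean_I ** 2
    # Linear regression in each window: q = a * I + b.
    a = cov_Ip / (var_I + eps)                  # eps controls smoothing strength
    b = mean_p - a * mean_I
    # Average the coefficients of all windows covering each pixel.
    mean_a = box_filter(a, r) / N
    mean_b = box_filter(b, r) / N
    return mean_a * I + mean_b
```

Self-guidance (`I` equal to `p`) gives edge-preserving smoothing; a separate guide gives joint filtering. All five `box_filter` calls run in time independent of `r`, which is the source of the O(N) complexity.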