CascadedGaze: Efficiency in Global Context Extraction for Image Restoration

05/2024 | Amirhosein Ghasemabadi, Muhammad Kamran Janjua, Mohammad Salameh, Chunhua Zhou, Fengyu Sun, Di Niu
The paper introduces CascadedGaze Network (CGNet), an encoder-decoder architecture designed to efficiently capture global context for image restoration tasks. CGNet employs a novel module called Global Context Extractor (GCE), which uses small kernel convolutions across convolutional layers to learn global dependencies without relying on self-attention. This approach balances performance and computational efficiency, achieving competitive results on synthetic image denoising and single image deblurring tasks while outperforming previous methods on real image denoising. The GCE module is composed of cascaded convolutional layers that progressively capture local and global context, followed by a Range Fuser to aggregate these contexts. The paper also includes a detailed experimental setup, results, and ablation studies to validate the effectiveness of CGNet and the GCE module.
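The idea that cascaded small-kernel convolutions can stand in for self-attention rests on a standard property of stacked convolutions: the effective receptive field grows with depth, so later layers "see" progressively more global context. A minimal sketch of that receptive-field arithmetic is below (the function name and layer configuration are illustrative, not taken from the paper):

```python
from math import prod

def receptive_field(kernel_sizes, strides):
    """Effective receptive field of a stack of conv layers.

    Uses the standard recurrence: starting from r = 1 and jump = 1,
    each layer with kernel k and stride s updates
        r    += (k - 1) * jump
        jump *= s
    so small kernels still cover a wide region once cascaded,
    especially when strides (downsampling) are interleaved.
    """
    r, jump = 1, 1
    for k, s in zip(kernel_sizes, strides):
        r += (k - 1) * jump
        jump *= s
    return r

# Three stride-1 3x3 convs already cover a 7x7 region per output pixel:
print(receptive_field([3, 3, 3], [1, 1, 1]))  # 7

# Interleaving stride-2 layers widens coverage much faster:
print(receptive_field([3, 3, 3], [1, 2, 2]))  # 9
```

This is why an encoder-decoder with downsampling stages, as in CGNet, can aggregate near-global context from small kernels alone; the paper's Range Fuser then combines the local and global features these cascaded layers produce.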