Learning a Single Convolutional Super-Resolution Network for Multiple Degradations

24 May 2018 | Kai Zhang, Wangmeng Zuo, Lei Zhang
The paper addresses the limitations of existing deep convolutional neural network (CNN)-based single image super-resolution (SISR) methods, which often assume bicubic downsampling and lack scalability to handle multiple degradations. To overcome these issues, the authors propose a general framework that uses a dimensionality stretching strategy to enable a single CNN to handle multiple and spatially variant degradations. The framework takes the low-resolution (LR) image, blur kernel, and noise level as inputs, allowing the network to learn from synthetic data and produce visually plausible results on both synthetic and real LR images. The proposed method, named SRMD, is evaluated on various datasets and shown to outperform state-of-the-art methods in terms of both quantitative metrics and visual quality. The results demonstrate the effectiveness and scalability of the proposed approach for practical SISR applications.
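The dimensionality stretching strategy can be sketched as follows: the blur kernel is projected onto a low-dimensional PCA basis, the noise level is appended, and the resulting degradation vector is stretched into uniform per-pixel maps that are concatenated with the LR image along the channel axis. The snippet below is a minimal illustration of this idea, not the authors' implementation; the kernel size, number of PCA components, and the random basis are assumptions for demonstration only.

```python
import numpy as np

def stretch_degradation(lr_img, kernel, noise_level, pca_basis):
    """Dimensionality stretching: project the vectorized blur kernel
    onto a PCA basis, append the noise level, then stretch the
    degradation vector into H x W maps concatenated with the LR image."""
    h, w, _ = lr_img.shape
    # Project the flattened kernel (e.g. 15x15 -> 225) onto a
    # low-dimensional basis (t components). The basis here is
    # hypothetical; in practice it is learned from a kernel set.
    k_reduced = pca_basis @ kernel.flatten()           # shape (t,)
    # Append the noise level to form the degradation vector.
    v = np.concatenate([k_reduced, [noise_level]])     # shape (t+1,)
    # Stretch each entry into a uniform H x W map.
    maps = np.broadcast_to(v, (h, w, v.size))          # (H, W, t+1)
    # Concatenate with the LR image: the network input then has
    # C + t + 1 channels.
    return np.concatenate([lr_img, maps], axis=-1)

# Toy usage with assumed sizes (3-channel LR image, 15x15 kernel,
# t = 15 PCA components).
rng = np.random.default_rng(0)
lr = rng.random((32, 32, 3))
ker = rng.random((15, 15))
ker /= ker.sum()                                       # normalized kernel
basis = rng.random((15, 225))                          # stand-in PCA basis
x = stretch_degradation(lr, ker, 0.1, basis)
print(x.shape)  # (32, 32, 19)
```

Because the degradation maps have the same spatial size as the LR image, the same mechanism extends to spatially variant degradations by letting the map values vary per pixel instead of being uniform.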