11 Nov 2016 | Jiwon Kim, Jung Kwon Lee and Kyoung Mu Lee
The paper introduces a Deeply-Recursive Convolutional Network (DRCN) for single-image super-resolution (SR). DRCN applies the same convolutional layer recursively, which widens the receptive field with each recursion without adding any new parameters. Training such a deep recursion naively is difficult because of exploding and vanishing gradients, so the authors propose two extensions: recursive-supervision and skip-connection. Recursive-supervision supervises the output of every recursion, which smooths gradient flow and reduces overfitting; skip-connection carries the input image directly to the reconstruction stage, so input information is not lost across recursions. The final prediction is an ensemble of the per-recursion outputs. The method outperforms previous approaches on common benchmarks, achieving state-of-the-art performance. The paper also discusses related work, including single-image SR and recursive neural networks, and provides a detailed mathematical formulation and experimental results.
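To make the recursion concrete, here is a minimal PyTorch sketch of the idea: an embedding net, a single convolutional layer applied repeatedly with shared weights, a shared reconstruction net applied after every recursion (recursive-supervision), and a skip-connection realized here as adding the interpolated input back to each prediction. The layer widths, sub-network depths, recursion count, and the residual form of the skip-connection are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class DRCNSketch(nn.Module):
    """Illustrative DRCN-style network (sizes are assumptions, not the paper's)."""
    def __init__(self, channels=64, recursions=16):
        super().__init__()
        self.recursions = recursions
        # Embedding net: maps the bicubic-upsampled LR image into feature space.
        self.embed = nn.Sequential(
            nn.Conv2d(1, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(inplace=True),
        )
        # Inference net: one conv layer applied recursively, so the receptive
        # field grows with each recursion at zero extra parameter cost.
        self.recursive = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(inplace=True),
        )
        # Reconstruction net: shared across all recursions (recursive-supervision).
        self.reconstruct = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, 1, 3, padding=1),
        )
        # Learned weights for ensembling the per-recursion predictions
        # (initialized uniformly; normalization scheme is an assumption).
        self.ensemble = nn.Parameter(torch.full((recursions,), 1.0 / recursions))

    def forward(self, x):
        h = self.embed(x)
        preds = []
        for _ in range(self.recursions):
            h = self.recursive(h)  # same weights at every step
            # Skip-connection: add the input back, so each recursion
            # effectively predicts residual detail.
            preds.append(self.reconstruct(h) + x)
        preds = torch.stack(preds)                      # (D, N, 1, H, W)
        w = self.ensemble.view(-1, 1, 1, 1, 1)
        out = (w * preds).sum(dim=0)                    # weighted ensemble
        return out, preds  # final output + intermediate predictions for the loss
```

In the spirit of the paper's training objective, one would compute a reconstruction loss on every intermediate prediction in `preds` as well as on the ensembled output `out`, and combine the two terms with a trade-off weight, so that gradients reach every recursion directly rather than only through the deepest path.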