23 Feb 2019 | Namhoon Lee, Thalaiyasingam Ajanthan & Philip H. S. Torr
The paper introduces a novel approach called Single-Shot Network Pruning (SNIP) for pruning large neural networks to reduce their computational complexity while maintaining performance. Unlike existing methods that require iterative optimization and complex pruning schedules, SNIP prunes the network once at initialization, before training. This is achieved by introducing a saliency criterion based on connection sensitivity, which identifies the connections that are structurally important for the given task. The method is simple, versatile, and interpretable, making it applicable to a variety of network architectures including convolutional, residual, and recurrent networks. SNIP achieves extremely sparse networks with minimal accuracy loss on datasets such as MNIST, CIFAR-10, and Tiny-ImageNet. The paper also demonstrates that the retained connections are indeed relevant to the task, providing interpretability and validation of the pruning process.
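As a minimal sketch of the idea, the snippet below computes a connection-sensitivity score at initialization: each weight gets a multiplicative mask variable c (set to 1), the gradient of the loss with respect to c is taken on a single mini-batch, and only the connections with the largest normalized gradient magnitudes are kept. The toy two-layer MLP, the layer sizes, the keep ratio, and the random stand-in mini-batch are illustrative assumptions for this sketch, not the paper's exact experimental setup.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Variance-scaled initial weights (the paper emphasizes proper initialization).
w1 = torch.randn(784, 300) * (2.0 / 784) ** 0.5
w2 = torch.randn(300, 10) * (2.0 / 300) ** 0.5

# Multiplicative connection masks c, initialized to 1 and treated as variables.
c1 = torch.ones_like(w1, requires_grad=True)
c2 = torch.ones_like(w2, requires_grad=True)

# One mini-batch drawn from the training distribution (random stand-in here).
x = torch.randn(128, 784)
y = torch.randint(0, 10, (128,))

# Forward pass with masked weights c * w and the usual cross-entropy loss.
h = F.relu(x @ (c1 * w1))
logits = h @ (c2 * w2)
loss = F.cross_entropy(logits, y)

# Connection sensitivity: gradient of the loss w.r.t. the masks, at c = 1.
g1, g2 = torch.autograd.grad(loss, [c1, c2])

# Saliency: normalized |g|; keep the top-k connections across the whole network.
total = g1.abs().sum() + g2.abs().sum()
s1, s2 = g1.abs() / total, g2.abs() / total

keep_ratio = 0.05                                   # e.g. 95% sparsity
all_scores = torch.cat([s1.flatten(), s2.flatten()])
k = int(keep_ratio * all_scores.numel())
threshold = torch.topk(all_scores, k).values[-1]    # k-th largest saliency

mask1 = (s1 >= threshold).float()
mask2 = (s2 >= threshold).float()

print(f"kept {int(mask1.sum() + mask2.sum())} of {all_scores.numel()} connections")
```

After this single pass, the masks are frozen and the surviving weights (mask * w) are trained in the usual way; no further pruning or fine-tuning schedule is needed.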