Spectrally-normalized margin bounds for neural networks

5 Dec 2017 | Peter L. Bartlett*, Dylan J. Foster†, Matus Telgarsky‡
This paper presents a margin-based generalization bound for neural networks that scales with their margin-normalized spectral complexity (roughly, the product of the layers' spectral norms divided by the margin). The bound is investigated empirically on an AlexNet network trained with SGD on the MNIST and CIFAR-10 datasets, with both original and random labels. The experiments show a strong correlation between the bound, the Lipschitz constants, and the excess risks, suggesting that SGD selects predictors whose complexity scales with the difficulty of the learning task. The bound is sensitive to this complexity, as demonstrated by the decaying curve of normalized Lipschitz constants over training epochs. The paper contributes to the understanding of neural network generalization by giving a rigorous statement of the bound together with empirical evidence of its effectiveness.
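For concreteness, the spectral complexity that normalizes the margin in the paper's main bound can be sketched numerically. The NumPy snippet below is a minimal illustration, not the authors' code: it assumes zero reference matrices M_i, activation Lipschitz constants of 1 (as for ReLU), and omits logarithmic factors and constants; the function names are hypothetical.

```python
import numpy as np

def spectral_complexity(weights, lipschitz=None, reference=None):
    """Sketch of the spectral complexity R_A from Bartlett, Foster & Telgarsky (2017):
    R_A = (prod_i rho_i * ||A_i||_sigma)
          * (sum_i (||(A_i - M_i)^T||_{2,1} / ||A_i||_sigma)^(2/3))^(3/2),
    with reference matrices M_i defaulting to zero and activation Lipschitz
    constants rho_i defaulting to 1 (e.g. ReLU)."""
    L = len(weights)
    lipschitz = lipschitz if lipschitz is not None else [1.0] * L
    reference = reference if reference is not None else [np.zeros_like(A) for A in weights]

    # Spectral norm ||A_i||_sigma: largest singular value of each weight matrix.
    spec = [np.linalg.norm(A, ord=2) for A in weights]
    # (2,1) norm of (A_i - M_i)^T: sum of the Euclidean norms of its columns.
    grp = [np.linalg.norm((A - M).T, axis=0).sum() for A, M in zip(weights, reference)]

    prod_term = np.prod([rho * s for rho, s in zip(lipschitz, spec)])
    sum_term = sum((b / s) ** (2.0 / 3.0) for b, s in zip(grp, spec)) ** 1.5
    return prod_term * sum_term

def margin_normalized_complexity(weights, X, margin, **kw):
    """Margin-normalized complexity ||X||_F * R_A / (margin * n): up to log
    factors, the quantity that controls the generalization gap in the bound."""
    n = X.shape[0]
    return np.linalg.norm(X) * spectral_complexity(weights, **kw) / (margin * n)

# Toy usage: a random 3-layer network on synthetic inputs.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((64, 32)) / 8,
           rng.standard_normal((32, 64)) / 6,
           rng.standard_normal((10, 32)) / 6]
X = rng.standard_normal((1000, 32))
print(margin_normalized_complexity(weights, X, margin=1.0))
```

In the paper's experiments, it is this kind of margin-normalized quantity, rather than raw parameter counts, that tracks the difficulty of the task (e.g. true versus random labels).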