May 16, 2009 | Nicolai Meinshausen and Peter Bühlmann
The paper introduces Stability Selection, a method for variable selection and structure estimation in high-dimensional data. It combines subsampling with high-dimensional selection algorithms to provide finite-sample control of the number of falsely selected variables and to make the results far less sensitive to the choice of regularisation parameter. The method is demonstrated through examples in variable selection and Gaussian graphical modelling, using both real and simulated data. Stability Selection is shown to be consistent for variable selection even when the conditions required by the original Lasso are violated. In particular, the authors introduce the Randomised Lasso, which, combined with Stability Selection, achieves consistent variable selection in settings where the plain Lasso fails because the irrepresentable condition does not hold. The paper also discusses the computational cost of the subsampling scheme and provides theoretical guarantees for the error control of Stability Selection.
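To make the procedure concrete, here is a minimal sketch of the core idea: repeatedly fit a sparse selector on random subsamples of half the data, record how often each variable is selected, and keep the variables whose selection frequency exceeds a threshold. It assumes scikit-learn's Lasso as the base selector; the function name, the fixed regularisation value, the 0.6 threshold, and the `weakness` reweighting (a randomised-Lasso-style perturbation) are illustrative choices, not the paper's recommended settings.

```python
import numpy as np
from sklearn.linear_model import Lasso

def stability_selection(X, y, alpha=0.1, n_subsamples=100, threshold=0.6,
                        weakness=None, seed=0):
    """Illustrative stability-selection sketch (not the authors' code).

    Returns per-variable selection frequencies and the indices of the
    'stable' variables whose frequency exceeds `threshold`.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    selected_counts = np.zeros(p)

    for _ in range(n_subsamples):
        # Draw a subsample of size floor(n/2) without replacement.
        idx = rng.choice(n, size=n // 2, replace=False)
        Xs, ys = X[idx], y[idx]

        # Randomised-Lasso-style perturbation: rescale each column by a
        # random weight in [weakness, 1], i.e. perturb the per-variable penalty.
        if weakness is not None:
            Xs = Xs * rng.uniform(weakness, 1.0, size=p)

        # Fit the base selector and record which coefficients are nonzero.
        coef = Lasso(alpha=alpha, max_iter=5000).fit(Xs, ys).coef_
        selected_counts += (np.abs(coef) > 1e-8)

    probs = selected_counts / n_subsamples
    return probs, np.flatnonzero(probs >= threshold)

# Toy usage on simulated data (illustrative only): 5 true signals among 50 variables.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 50))
beta = np.zeros(50)
beta[:5] = 2.0
y = X @ beta + rng.standard_normal(200)
probs, stable = stability_selection(X, y, alpha=0.1, weakness=0.5)
```

The error control referenced above is the paper's bound: under an exchangeability assumption, choosing the threshold pi_thr for the selection frequencies controls the expected number V of falsely selected variables via E(V) <= q^2 / ((2*pi_thr - 1) * p), where q is the average number of variables selected per subsample and p the total number of variables.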