January 2010, Volume 33, Issue 1 | Jerome Friedman, Trevor Hastie, Rob Tibshirani
The paper develops fast algorithms for estimating generalized linear models with convex penalties, including linear regression, two-class logistic regression, and multinomial regression. The penalties considered are $\ell_1$ (lasso), $\ell_2$ (ridge regression), and mixtures of these (elastic net). The algorithms use cyclical coordinate descent along a regularization path, which can handle large datasets and sparse features efficiently. The methods are significantly faster than competing algorithms in comparative timings. The paper also discusses implementation details, including naive and covariance updates, sparse updates, weighted updates, and pathwise coordinate descent. The authors provide an R package, glmnet, and demonstrate its performance through simulations and real data examples. The paper concludes with a discussion on selecting tuning parameters and references to related work.
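To make the coordinate-descent idea concrete, the sketch below implements a naive cyclical update for the Gaussian elastic net: each coefficient is refit in turn by soft-thresholding its partial-residual correlation and shrinking by the ridge term. This is a minimal illustration, not the glmnet implementation itself; it assumes predictors standardized to mean zero and unit variance and a centered response, and the function names (`soft_threshold`, `enet_cd`) and simulated data are hypothetical.

```r
# Minimal sketch of naive cyclical coordinate descent for the elastic net
# (Gaussian loss). Assumes columns of x are standardized and y is centered.
soft_threshold <- function(z, gamma) sign(z) * pmax(abs(z) - gamma, 0)

enet_cd <- function(x, y, lambda, alpha = 1, n_iter = 100, tol = 1e-7) {
  n <- nrow(x); p <- ncol(x)
  beta <- rep(0, p)
  r <- y                                  # full residual y - x %*% beta
  for (it in seq_len(n_iter)) {
    max_change <- 0
    for (j in seq_len(p)) {
      # correlation of predictor j with the partial residual
      z <- drop(crossprod(x[, j], r)) / n + beta[j]
      # lasso soft-threshold in the numerator, ridge shrinkage in the denominator
      b_new <- soft_threshold(z, lambda * alpha) / (1 + lambda * (1 - alpha))
      if (b_new != beta[j]) {
        r <- r - (b_new - beta[j]) * x[, j]   # update residual in O(n)
        max_change <- max(max_change, abs(b_new - beta[j]))
        beta[j] <- b_new
      }
    }
    if (max_change < tol) break             # stop when no coefficient moves much
  }
  beta
}

# Illustrative usage on simulated data
set.seed(1)
x <- scale(matrix(rnorm(100 * 20), 100, 20))
y <- drop(x[, 1:3] %*% c(3, -2, 1) + rnorm(100)); y <- y - mean(y)
round(enet_cd(x, y, lambda = 0.1, alpha = 0.5), 3)
```

In practice one would call the glmnet package directly (e.g. `glmnet(x, y, alpha = 0.5)`), which computes the whole regularization path with the covariance, sparse, and warm-start tricks discussed in the paper; the sketch above only shows the basic coordinate-wise update.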