AIC for the lasso in generalized linear models

Yoshiyuki Ninomiya, Shuichi Kawano

Research output: Contribution to journal › Article

4 Citations (Scopus)

Abstract

The lasso is a popular regularization method that can perform estimation and model selection simultaneously. It contains a regularization parameter, and several information criteria have been proposed for selecting its proper value. While any of them would assure consistency in model selection, there is no appropriate rule for choosing among the criteria. Meanwhile, a finite correction to the AIC has been provided in the Gaussian regression setting. The finite correction is theoretically justified not from the viewpoint of consistency but from that of minimizing the prediction error, and it does not suffer from the above-mentioned difficulty. Our aim is to derive such a criterion for the lasso in generalized linear models. Toward this aim, we derive a criterion from the original definition of the AIC, that is, as an asymptotically unbiased estimator of the Kullback-Leibler divergence. This reduces to the finite correction in the Gaussian regression setting, so our criterion can be regarded as its generalization. Our criterion is easily computed and requires far less computation than cross-validation, yet simulation studies and real data analyses indicate that its performance is almost the same as, or superior to, that of cross-validation. Moreover, our criterion extends to a class of other regularization methods.
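The abstract does not state the criterion in closed form, so the following is only a minimal sketch of the general recipe it describes: score each candidate penalty with an AIC-type quantity AIC(lambda) = -2 log L(beta_hat(lambda)) + 2 df(lambda) and pick the minimizer. The sketch fits an L1-penalized logistic regression and approximates df(lambda) by the number of active coefficients (the classical Gaussian-lasso degrees-of-freedom result of Zou, Hastie and Tibshirani, not the bias-corrected term derived in this paper); the synthetic data, the candidate grid, and the C = 1/lambda parameterization are all assumptions of the example.

# Illustrative sketch only: choose the lasso penalty for a logistic GLM by
# minimizing a naive AIC, AIC(lam) = -2 * loglik + 2 * df(lam), where df(lam)
# is approximated by the number of nonzero coefficients. This is NOT the
# bias-corrected criterion derived in the paper.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Synthetic data (an assumption of this sketch, not taken from the paper).
X, y = make_classification(n_samples=200, n_features=20, n_informative=5,
                           random_state=0)

def naive_aic(lam):
    # scikit-learn parameterizes the L1 penalty through C = 1 / lam (up to a
    # scaling of the loss); the liblinear solver supports penalty="l1".
    model = LogisticRegression(penalty="l1", C=1.0 / lam, solver="liblinear")
    model.fit(X, y)
    p = model.predict_proba(X)[:, 1]
    eps = 1e-12                                # guard against log(0)
    loglik = np.sum(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))
    df = np.count_nonzero(model.coef_) + 1     # active coefficients + intercept
    return -2.0 * loglik + 2.0 * df

lams = np.logspace(-2, 2, 30)                  # candidate grid: an assumption
best = min(lams, key=naive_aic)
print(f"selected lambda = {best:.4f}")

Each grid point costs a single model fit here, whereas K-fold cross-validation would cost K fits per grid point; this is the computational saving the abstract refers to.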

Original language: English
Pages (from-to): 2537-2560
Number of pages: 24
Journal: Electronic Journal of Statistics
Volume: 10
Issue number: 2
DOI: 10.1214/16-EJS1179
Publication status: Published - Jan 1, 2016


All Science Journal Classification (ASJC) codes

  • Statistics and Probability
  • Statistics, Probability and Uncertainty

Cite this

Ninomiya, Yoshiyuki; Kawano, Shuichi. AIC for the lasso in generalized linear models. In: Electronic Journal of Statistics, Vol. 10, No. 2, 2016, pp. 2537-2560. DOI: 10.1214/16-EJS1179.
@article{1f0111b9c36e4ae18ffb4cb14dec5f28,
  title     = "AIC for the lasso in generalized linear models",
  author    = "Yoshiyuki Ninomiya and Shuichi Kawano",
  journal   = "Electronic Journal of Statistics",
  issn      = "1935-7524",
  publisher = "Institute of Mathematical Statistics",
  volume    = "10",
  number    = "2",
  pages     = "2537--2560",
  year      = "2016",
  month     = "1",
  day       = "1",
  doi       = "10.1214/16-EJS1179",
  language  = "English",
}

TY  - JOUR
T1  - AIC for the lasso in generalized linear models
AU  - Ninomiya, Yoshiyuki
AU  - Kawano, Shuichi
PY  - 2016/1/1
Y1  - 2016/1/1
UR  - http://www.scopus.com/inward/record.url?scp=84988708751&partnerID=8YFLogxK
UR  - http://www.scopus.com/inward/citedby.url?scp=84988708751&partnerID=8YFLogxK
U2  - 10.1214/16-EJS1179
DO  - 10.1214/16-EJS1179
M3  - Article
AN  - SCOPUS:84988708751
VL  - 10
SP  - 2537
EP  - 2560
JO  - Electronic Journal of Statistics
JF  - Electronic Journal of Statistics
SN  - 1935-7524
IS  - 2
ER  -