A computationally fast alternative to cross-validation in penalized Gaussian graphical models

I. Vujacic, A. Abbruzzo, E. de Wit

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

We study the problem of selecting a regularization parameter in penalized Gaussian graphical models. When the goal is to obtain a model with good predictive power, cross-validation is the gold standard. We present a new estimator of Kullback–Leibler loss in Gaussian graphical models which provides a computationally fast alternative to cross-validation. The estimator is obtained by approximating leave-one-out cross-validation. Our approach is demonstrated on simulated data sets for various types of graphs. Compared with other available alternatives to cross-validation, such as Akaike's information criterion and generalized approximate cross-validation, the proposed formula exhibits superior performance, especially in the typical small-sample-size scenario. We also show that the estimator can be used to improve the performance of the Bayesian information criterion when the sample size is small.
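As context for the cross-validation baseline the abstract refers to, the sketch below (in Python, assuming scikit-learn's GraphicalLasso and a synthetic placeholder data set) selects the regularization parameter by K-fold cross-validation of the held-out Gaussian log-likelihood. It illustrates the computational cost that the paper's fast Kullback–Leibler loss estimator is designed to avoid; it is not the proposed estimator itself, whose formula is not reproduced here.

# Illustrative sketch, not the paper's estimator: choose the graphical-lasso
# regularization parameter by K-fold cross-validation of the Gaussian
# log-likelihood (the "gold standard" baseline mentioned in the abstract).
# The alpha grid and the data-generating model are arbitrary placeholders.
import numpy as np
from sklearn.covariance import GraphicalLasso
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)
p, n = 10, 50                      # small-sample regime discussed in the abstract
X = rng.standard_normal((n, p))    # placeholder data; replace with real observations

def gaussian_loglik(precision, S):
    """Gaussian log-likelihood (up to additive and multiplicative constants)
    of a sample covariance S under a fitted precision matrix."""
    _, logdet = np.linalg.slogdet(precision)
    return logdet - np.trace(S @ precision)

alphas = np.logspace(-2, 0, 10)
cv_scores = []
for alpha in alphas:
    fold_scores = []
    for train_idx, test_idx in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
        model = GraphicalLasso(alpha=alpha, max_iter=200)
        model.fit(X[train_idx])                               # refit for every fold and alpha
        S_test = np.cov(X[test_idx], rowvar=False, bias=True)  # held-out sample covariance
        fold_scores.append(gaussian_loglik(model.precision_, S_test))
    cv_scores.append(np.mean(fold_scores))

best_alpha = alphas[int(np.argmax(cv_scores))]
print(f"CV-selected regularization parameter: {best_alpha:.3f}")

The nested loop over folds and grid points is what makes cross-validation expensive when each graphical-lasso fit is costly; the paper's estimator approximates leave-one-out cross-validation without refitting for every fold.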
Original language: English
Pages (from-to): 3628-3640
Number of pages: 13
Journal: Journal of Statistical Computation and Simulation
Volume: 85
Issue number: 18
Early online date: 13 Jan 2015
DOIs
Publication status: E-pub ahead of print - 13 Jan 2015

Keywords

Gaussian graphical model, penalized estimation, Kullback–Leibler loss, cross-validation, generalized approximate cross-validation, information criteria
