Hyperparameter-selection for sparse regression: A probabilistic approach

Research output: Chapter in book/report/Conference proceeding › Conference paper in proceeding › Peer review


The choice of hyperparameter(s) notably affects the support recovery in LASSO-like sparse regression problems, acting as an implicit model order selection. Parameters are typically selected using cross-validation or various ad hoc approaches. These often overestimate the resulting model order, as they aim to minimize the prediction error rather than to maximize the support recovery. In this work, we propose a probabilistic approach to selecting hyperparameters in order to maximize the support recovery, quantifying the type I error (false positive rate) using extreme value analysis, such that the regularization level is selected as an appropriate quantile. By instead solving the scaled LASSO problem, the proposed choice of hyperparameter becomes almost independent of the noise variance. Simulation examples illustrate how the proposed method outperforms both cross-validation and the Bayesian Information Criterion in terms of computational complexity and support recovery.
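As a rough illustration of the quantile-based idea (a sketch of the general principle, not the authors' exact method), the regularization level can be chosen as a high quantile of the null statistic max_j |x_j^T e|, where e is pure noise, so that the probability of any false positive under the null is controlled at level alpha. The sketch below assumes standard Gaussian noise and unit-norm columns, estimates the quantile by Monte Carlo, and solves the LASSO with a basic ISTA solver; all function names are illustrative.

```python
import numpy as np

def lambda_quantile(X, alpha=0.05, n_mc=500, rng=None):
    # Monte Carlo estimate of the (1 - alpha) quantile of
    # max_j |x_j^T e| under pure-noise data e ~ N(0, I).
    rng = np.random.default_rng(rng)
    n, p = X.shape
    stats = np.empty(n_mc)
    for k in range(n_mc):
        e = rng.standard_normal(n)
        stats[k] = np.max(np.abs(X.T @ e))
    return np.quantile(stats, 1 - alpha)

def lasso_ista(X, y, lam, n_iter=500):
    # Basic ISTA solver for 0.5 * ||y - X b||^2 + lam * ||b||_1.
    n, p = X.shape
    L = np.linalg.norm(X, 2) ** 2  # Lipschitz constant of the gradient
    b = np.zeros(p)
    for _ in range(n_iter):
        g = X.T @ (X @ b - y)
        z = b - g / L
        b = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return b

# Example: 3-sparse signal in unit-variance Gaussian noise.
rng = np.random.default_rng(0)
n, p = 100, 50
X = rng.standard_normal((n, p))
X /= np.linalg.norm(X, axis=0)          # unit-norm columns
beta = np.zeros(p)
beta[:3] = 10.0                         # true support is {0, 1, 2}
y = X @ beta + rng.standard_normal(n)

lam = lambda_quantile(X, alpha=0.05, rng=1)
b_hat = lasso_ista(X, y, lam)
support = np.flatnonzero(np.abs(b_hat) > 1e-6)
print("lambda:", lam)
print("estimated support:", support)
```

Because the null statistic here is computed for unit-variance noise, an unknown noise level would rescale the appropriate quantile; this is the dependence that the scaled LASSO formulation in the paper is designed to remove.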

Title of host publication: Conference Record of 51st Asilomar Conference on Signals, Systems and Computers, ACSSC 2017
Publisher: IEEE - Institute of Electrical and Electronics Engineers Inc.
Number of pages: 5
ISBN (electronic): 9781538618233
Status: Published - 2018 Apr. 10
Event: 51st Asilomar Conference on Signals, Systems and Computers, ACSSC 2017 - Pacific Grove, USA
Duration: 2017 Oct. 29 - 2017 Nov. 1


Conference: 51st Asilomar Conference on Signals, Systems and Computers, ACSSC 2017
City: Pacific Grove

Subject classification (UKÄ)

  • Probability theory and statistics


