Hyperparameter-selection for sparse regression: A probabilistic approach

Research output: Chapter in Book/Report/Conference proceeding › Conference paper in proceeding › Peer review

Abstract

The choice of hyperparameter(s) notably affects the support recovery in LASSO-like sparse regression problems, acting as an implicit model order selection. Parameters are typically selected using cross-validation or various ad hoc approaches. These often overestimate the resulting model order, as they aim to minimize the prediction error rather than maximize the support recovery. In this work, we propose a probabilistic approach to selecting hyperparameters so as to maximize the support recovery, quantifying the type I error (false positive rate) using extreme value analysis, such that the regularization level is selected as an appropriate quantile. By instead solving the scaled LASSO problem, the proposed choice of hyperparameter becomes almost independent of the noise variance. Simulation examples illustrate how the proposed method outperforms both cross-validation and the Bayesian Information Criterion in terms of computational complexity and support recovery.
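As a rough illustration of the idea (not code from the paper), the sketch below selects the regularization level as a quantile of the maximal spurious correlation between the predictors and pure unit-variance noise, so that the probability of any false positive is approximately the chosen level. The quantile is estimated here by plain Monte Carlo rather than the extreme value approximation used in the paper, and the problem sizes, variable names, and the use of scikit-learn's ordinary Lasso (in place of the scaled LASSO) are assumptions for illustration only.

import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Hypothetical problem setup (not from the paper): n samples, p predictors,
# a k-sparse ground truth, and unit-variance additive noise.
n, p, k = 200, 500, 5
X = rng.standard_normal((n, p))
X /= np.linalg.norm(X, axis=0)           # unit-norm columns
beta = np.zeros(p)
beta[rng.choice(p, k, replace=False)] = 3.0
y = X @ beta + rng.standard_normal(n)

# Under the noise-only model, a predictor is falsely selected when its
# correlation with the noise exceeds the regularization level. Estimate the
# distribution of the maximal spurious correlation by Monte Carlo and take its
# (1 - alpha_fp) quantile, so the type I error (any false positive) is ~alpha_fp.
alpha_fp = 0.05                          # target false-positive rate (assumed)
n_mc = 1000
max_corr = np.empty(n_mc)
for b in range(n_mc):
    e = rng.standard_normal(n)           # unit-variance noise, as in the scaled-LASSO setting
    max_corr[b] = np.max(np.abs(X.T @ e))
lam = np.quantile(max_corr, 1.0 - alpha_fp)

# sklearn's Lasso minimizes (1/(2n))||y - Xw||^2 + alpha*||w||_1, so the
# Monte Carlo threshold is rescaled by n before fitting.
fit = Lasso(alpha=lam / n, fit_intercept=False).fit(X, y)
print("estimated support:", np.flatnonzero(fit.coef_))
print("true support:     ", np.flatnonzero(beta))

In this sketch the quantile depends only on the design matrix and the target false-positive rate, which mirrors the paper's point that, with the scaled LASSO, the hyperparameter choice becomes almost independent of the noise variance.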

Original language: English
Title of host publication: Conference Record of 51st Asilomar Conference on Signals, Systems and Computers, ACSSC 2017
Publisher: IEEE - Institute of Electrical and Electronics Engineers Inc.
Pages: 853-857
Number of pages: 5
Volume: 2017-October
ISBN (electronic): 9781538618233
DOI
Status: Published - 2018 Apr 10
Event: 51st Asilomar Conference on Signals, Systems and Computers, ACSSC 2017 - Pacific Grove, USA
Duration: 2017 Oct 29 - 2017 Nov 1

Conference

Conference: 51st Asilomar Conference on Signals, Systems and Computers, ACSSC 2017
Country/Territory: USA
City: Pacific Grove
Period: 2017/10/29 - 2017/11/01

Subject classification (UKÄ)

  • Probability Theory and Statistics
