Coordinate Descent for SLOPE

Johan Larsson, Quentin Klopfenstein, Mathurin Massias, Jonas Wallin

Research output: Chapter in Book/Report/Conference proceeding › Paper in conference proceeding › peer-review

Abstract

The lasso is the most famous sparse regression and feature selection method. One reason for its popularity is the speed at which the underlying optimization problem can be solved. Sorted L-One Penalized Estimation (SLOPE) is a generalization of the lasso with appealing statistical properties. In spite of this, the method has not yet attracted widespread interest. A major reason for this is that current software packages that fit SLOPE rely on algorithms that perform poorly in high dimensions. To tackle this issue, we propose a new fast algorithm to solve the SLOPE optimization problem, which combines proximal gradient descent and proximal coordinate descent steps. We provide new results on the directional derivative of the SLOPE penalty and its related SLOPE thresholding operator, as well as convergence guarantees for our proposed solver. In extensive benchmarks on simulated and real data, we demonstrate our method's performance against a long list of competing algorithms.
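The SLOPE penalty referenced in the abstract is the sorted-ℓ1 norm, which applies a nonincreasing sequence of weights to the coefficients sorted by decreasing magnitude. Its proximal operator (the "SLOPE thresholding operator" mentioned above) can be computed by sorting the absolute values, subtracting the weights, running isotonic regression (pool-adjacent-violators), and clipping at zero. The following is a minimal NumPy sketch of this standard prox computation, not the paper's proposed hybrid solver; the function name `prox_sorted_l1` and the stack-based merge are illustrative choices:

```python
import numpy as np

def prox_sorted_l1(y, lam):
    """Prox of the sorted-L1 (SLOPE) penalty.

    y   : input vector.
    lam : nonincreasing, nonnegative weight sequence (same length as y).
    """
    sign = np.sign(y)
    abs_y = np.abs(y)
    order = np.argsort(abs_y)[::-1]        # indices sorting |y| in decreasing order
    z = abs_y[order] - lam                 # subtract weights from sorted magnitudes

    # Pool-adjacent-violators: enforce a nonincreasing solution by merging
    # adjacent blocks whose averages violate monotonicity.
    n = len(z)
    blocks = []                            # stack of (start, end, block average)
    for i in range(n):
        start, end, val = i, i, z[i]
        while blocks and blocks[-1][2] <= val:
            s, e, v = blocks.pop()
            val = (v * (e - s + 1) + val * (end - start + 1)) / (end - s + 1)
            start = s
        blocks.append((start, end, val))

    x = np.zeros(n)
    for s, e, v in blocks:
        x[s:e + 1] = max(v, 0.0)           # clip negative block averages at zero

    out = np.empty(n)
    out[order] = x                         # undo the sort
    return sign * out                      # restore signs
```

With all weights equal, this reduces to ordinary soft thresholding, which is one way to sanity-check an implementation; with decreasing weights, it can cluster coefficients of similar magnitude to a common value, which is the distinctive behavior of SLOPE.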

Original language: English
Title of host publication: Proceedings of the 26th International Conference on Artificial Intelligence and Statistics
Editors: Francisco Ruiz, Jennifer Dy, Jan-Willem van de Meent
Pages: 4802-4821
Number of pages: 20
Volume: 206
Publication status: Published - 2023
Event: 26th International Conference on Artificial Intelligence and Statistics, AISTATS 2023 - Valencia, Spain
Duration: 2023 Apr 25 - 2023 Apr 27

Publication series

Name: Proceedings of Machine Learning Research

Conference

Conference: 26th International Conference on Artificial Intelligence and Statistics, AISTATS 2023
Country/Territory: Spain
City: Valencia
Period: 2023/04/25 - 2023/04/27

Subject classification (UKÄ)

  • Probability Theory and Statistics
