Low-rank inducing norms with optimality interpretations

Christian Grussler, Pontus Giselsson

Research output: Contribution to journal › Article in scientific journal › Peer review

Abstract

Optimization problems with rank constraints appear in many diverse fields such as control, machine learning, and image analysis. Since the rank constraint is nonconvex, these problems are often approximately solved via convex relaxations. Nuclear norm regularization is the prevailing convexifying technique for dealing with these types of problems. This paper introduces a family of low-rank inducing norms and regularizers which includes the nuclear norm as a special case. A posteriori guarantees on solving an underlying rank-constrained optimization problem with these convex relaxations are provided. We evaluate the performance of the low-rank inducing norms on three matrix completion problems. In all examples, the nuclear norm heuristic is outperformed by convex relaxations based on other low-rank inducing norms. For two of the problems, there exist low-rank inducing norms that succeed in recovering the partially unknown matrix, while the nuclear norm fails. These low-rank inducing norms are shown to be representable as semidefinite programs. Moreover, these norms have cheaply computable proximal mappings, which makes it possible to solve large-scale problems with first-order methods as well.
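For context on the nuclear norm heuristic and the role of proximal mappings mentioned above, the sketch below shows a standard proximal gradient (singular value thresholding) scheme for nuclear-norm-regularized matrix completion. It is not code from the paper and does not implement the proposed low-rank inducing norms; the function names (svt, complete_nuclear) and parameters (lam, step, iters) are illustrative assumptions.

```python
import numpy as np

def svt(X, tau):
    # Singular value thresholding: the proximal mapping of tau * (nuclear norm).
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def complete_nuclear(M_obs, mask, lam=0.1, step=1.0, iters=500):
    # Proximal gradient on 0.5 * ||mask * (X - M_obs)||_F^2 + lam * ||X||_*,
    # where `mask` is a boolean array marking the observed entries.
    X = np.zeros_like(M_obs)
    for _ in range(iters):
        grad = mask * (X - M_obs)              # gradient of the data-fit term
        X = svt(X - step * grad, step * lam)   # proximal (SVT) step
    return X

# Toy example: recover a random rank-1 matrix from 60% of its entries.
rng = np.random.default_rng(0)
M = np.outer(rng.standard_normal(20), rng.standard_normal(20))
mask = rng.random(M.shape) < 0.6
X_hat = complete_nuclear(mask * M, mask)
print("relative error:", np.linalg.norm(X_hat - M) / np.linalg.norm(M))
```

In the paper's setting, the nuclear-norm proximal step above would be replaced by the proximal mapping of a low-rank inducing norm, keeping the same first-order iteration structure.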

Original language: English
Pages (from-to): 3057-3078
Number of pages: 22
Journal: SIAM Journal on Optimization
Volume: 28
Issue: 4
DOI
Status: Published - 2018

Subject classification (UKÄ)

  • Control Engineering
