Deep ordinal regression with label diversity

Axel Berg, Magnus Oskarsson, Mark O'Connor

Research output: Chapter in book/report/conference proceeding › Conference paper in proceeding › Peer review


Regression via classification (RvC) is a common method used for regression problems in deep learning, where the target variable takes values in a continuous range. By discretizing the target into a set of non-overlapping classes, it has been shown that training a classifier can improve neural network accuracy compared to using a standard regression approach. However, it is not clear how the set of discrete classes should be chosen and how this choice affects the overall solution. In this work, we propose that using several discrete data representations simultaneously can improve neural network learning compared to a single representation. Our approach is end-to-end differentiable and can be added as a simple extension to conventional learning methods, such as deep neural networks. We test our method on three challenging tasks and show that it reduces the prediction error compared to a baseline RvC approach while maintaining similar model complexity.
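The core RvC idea described in the abstract can be sketched as follows: the continuous target is binned into non-overlapping classes, a classifier head produces a distribution over those classes, and a soft prediction is recovered as the probability-weighted average of bin centers. The snippet below is a minimal NumPy illustration under assumed bin edges and toy logits (not the paper's architecture); combining two discretizations by averaging their soft predictions is one simple stand-in for the label-diversity idea.

```python
import numpy as np

def discretize(y, edges):
    # RvC label: index of the non-overlapping bin containing y.
    return int(np.clip(np.digitize(y, edges) - 1, 0, len(edges) - 2))

def expected_value(logits, edges):
    # Soft prediction: softmax over classes, then probability-weighted
    # average of bin centers (differentiable in a real network).
    centers = (edges[:-1] + edges[1:]) / 2
    p = np.exp(logits - logits.max())
    p /= p.sum()
    return float(p @ centers)

# Two hypothetical discretizations of the target range [0, 10]:
edges_coarse = np.linspace(0, 10, 6)    # 5 classes
edges_fine   = np.linspace(0, 10, 11)   # 10 classes

y = 3.7
cls_coarse = discretize(y, edges_coarse)  # coarse-grid class label
cls_fine = discretize(y, edges_fine)      # fine-grid class label

# Toy logits peaked at the true class, standing in for network outputs.
logits_coarse = -2.0 * np.abs(np.arange(5) - cls_coarse)
logits_fine   = -2.0 * np.abs(np.arange(10) - cls_fine)

# Averaging the soft predictions of the two heads is one simple way
# to combine multiple discrete representations of the same target.
pred = 0.5 * (expected_value(logits_coarse, edges_coarse)
              + expected_value(logits_fine, edges_fine))
```

In a trained model, each head would be supervised with a cross-entropy loss against its own discretization of the same target, so the classifiers share a feature extractor but see the label at different granularities.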
Title of host publication: 2020 25th International Conference on Pattern Recognition (ICPR)
Publisher: IEEE - Institute of Electrical and Electronics Engineers Inc.
Number of pages: 8
ISBN (electronic): 978-1-7281-8808-9
Status: Published - 2021
Event: 2020 25th International Conference on Pattern Recognition - Virtual, Milan, Italy
Duration: 10 Jan 2021 to 15 Jan 2021
Conference number: 25


Name: International Conference on Pattern Recognition
ISSN (print): 1051-4651


Conference: 2020 25th International Conference on Pattern Recognition
Abbreviated title: ICPR 2020

Subject classification (UKÄ)

  • Computer Vision and Robotics (Autonomous Systems)


