Description
Topic: Prototype Softmax Cross Entropy: A New Perspective on Softmax Cross Entropy
When: 14 June at 12.00-13.15
Speaker: Prof. Nicola Wolpert, Hochschule für Technik Stuttgart
Moderator: Maria Priisalu, PhD Student, Mathematics, Lund University
Where: Online - link by registration
Spoken language: English
Abstract
In the talk we consider supervised learning for image classification. Inspired by recent results in the field of supervised contrastive learning, we focus on the loss function for the feature encoder. We show that Softmax Cross Entropy (SCE) can be interpreted as a special kind of loss function in contrastive learning with prototypes. This insight provides a completely new perspective on cross entropy, allowing the derivation of a new generalized loss function, called Prototype Softmax Cross Entropy (PSCE), for use in supervised contrastive learning.
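The core observation can be illustrated with a small sketch (this is an illustrative toy example of the identity, not the authors' implementation): when a classifier computes logits as a linear map of the encoder's feature, Softmax Cross Entropy on those logits is exactly a contrastive loss in which the rows of the weight matrix act as class prototypes — the prototype of the true class is the positive, the others are negatives.

```python
import numpy as np

def softmax_cross_entropy(logits, label):
    # standard SCE on class logits, with the usual max-shift for stability
    z = logits - logits.max()
    return -z[label] + np.log(np.exp(z).sum())

def prototype_contrastive_loss(feature, prototypes, label):
    # contrastive loss in which each class prototype is compared to the
    # feature via a dot-product similarity; the own-class prototype is
    # the positive, all other prototypes are negatives
    sims = prototypes @ feature  # one similarity score per class
    s = sims - sims.max()
    return -s[label] + np.log(np.exp(s).sum())

# With logits = W @ feature (a linear classifier head, rows of W
# playing the role of prototypes), the two losses coincide:
rng = np.random.default_rng(0)
feature = rng.normal(size=8)
W = rng.normal(size=(5, 8))  # 5 hypothetical classes, rows as prototypes
label = 2
assert np.isclose(softmax_cross_entropy(W @ feature, label),
                  prototype_contrastive_loss(feature, W, label))
```

The equality holds for any feature and weight matrix, since both expressions are the same log-sum-exp computation; PSCE generalizes this view by treating the prototypes as objects of the contrastive loss in their own right.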
We prove both mathematically and experimentally that PSCE is superior to other loss functions in supervised contrastive learning. PSCE uses only fixed prototypes, so no self-organizing part of contrastive learning is required, which eliminates the memory bottleneck of previous supervised contrastive learning approaches. PSCE is also equally effective on balanced and unbalanced data.
| Period | 2023 Jun 14 |
| --- | --- |
| Event type | Seminar |
| Location | Lund, Sweden |
| Degree of Recognition | National |
Related content
Projects
- Lund University AI Research (Project: Network)