sigmoidF1: A Smooth F1 Score Surrogate Loss for Multilabel Classification

Open Access
Publication date: 09-2022
Journal: Transactions on Machine Learning Research
Article number: 148
Volume/Issue: 2022
Number of pages: 28
Organisations:
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
Abstract
Multilabel classification is the task of assigning multiple labels to each example. Current models reduce the multilabel setting to either multiple binary classifications or a single multiclass classification, allowing the use of existing loss functions (sigmoid, cross-entropy, logistic, etc.). These reductions do not accommodate the prediction of a varying number of labels per example. Moreover, the loss functions are distant estimates of the performance metrics. We propose sigmoidF1, a loss function that approximates the F1 score and that (i) is smooth and tractable for stochastic gradient descent, (ii) naturally approximates a multilabel metric, and (iii) estimates both label suitability and label counts. We show that any confusion-matrix metric can be formulated with a smooth surrogate. We evaluate the proposed loss function on text and image datasets, and with a variety of metrics, to account for the complexity of multilabel classification evaluation. sigmoidF1 outperforms other loss functions on one text and two image datasets over several metrics. These results show the effectiveness of using inference-time metrics as loss functions for non-trivial classification problems like multilabel classification.
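The core idea sketched in the abstract, replacing the hard thresholding in the F1 confusion-matrix entries with a sigmoid so the score becomes differentiable, can be illustrated as follows. This is a minimal NumPy sketch under assumptions, not the paper's reference implementation: the function name, the slope/offset hyperparameters (here `beta` and `eta`), and the epsilon constant are illustrative choices.

```python
import numpy as np

def sigmoid_f1_loss(logits, labels, beta=1.0, eta=0.0, eps=1e-16):
    """Smooth surrogate of 1 - F1 for multilabel predictions (illustrative sketch).

    logits: raw model scores per label; labels: binary ground truth (same shape).
    """
    # Sigmoid with tunable slope (beta) and offset (eta) replaces the
    # hard 0/1 decision, making the confusion-matrix entries smooth.
    s = 1.0 / (1.0 + np.exp(-beta * (logits + eta)))

    # Soft confusion-matrix entries: each prediction contributes fractionally.
    tp = np.sum(s * labels)          # soft true positives
    fp = np.sum(s * (1 - labels))    # soft false positives
    fn = np.sum((1 - s) * labels)    # soft false negatives

    # Smooth F1; eps avoids division by zero when there are no positives.
    soft_f1 = 2 * tp / (2 * tp + fn + fp + eps)
    return 1.0 - soft_f1             # minimize loss = maximize soft F1
```

Because `tp`, `fp`, and `fn` are smooth functions of the logits, gradients flow through the whole expression, and the same substitution yields smooth surrogates for any other confusion-matrix metric (precision, recall, etc.).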
Document type: Article
Language: English
Published at: https://doi.org/10.48550/arXiv.2108.10566
Published at: https://openreview.net/forum?id=gvSHaaD2wQ
Other links: http://jmlr.org/tmlr/papers/ https://github.com/gabriben/metrics-as-losses