Variational Multi-Task Learning with Gumbel-Softmax Priors

Open Access
Authors
Publication date 2022
Host editors
  • M. Ranzato
  • A. Beygelzimer
  • Y. Dauphin
  • P.S. Liang
  • J. Wortman Vaughan
Book title 35th Conference on Neural Information Processing Systems (NeurIPS 2021)
Book subtitle online, 6-14 December 2021
ISBN (electronic)
  • 9781713845393
Series Advances in Neural Information Processing Systems
Event NeurIPS 2021
Volume | Issue number 25
Pages (from-to) 21031-21042
Publisher San Diego, CA: Neural Information Processing Systems Foundation
Organisations
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
Abstract
Multi-task learning aims to exploit task relatedness to improve individual tasks, which is of particular significance in the challenging scenario where only limited data are available for each task. To tackle this challenge, we propose variational multi-task learning (VMTL), a general probabilistic inference framework for learning multiple related tasks. We cast multi-task learning as a variational Bayesian inference problem, in which task relatedness is explored in a unified manner by specifying priors. To incorporate shared knowledge into each task, we design the prior of a task to be a learnable mixture of the variational posteriors of other related tasks, with the mixing weights learned by the Gumbel-Softmax technique. In contrast to previous methods, VMTL can exploit task relatedness for both representations and classifiers in a principled way by jointly inferring their posteriors. This enables individual tasks to fully leverage the inductive biases provided by related tasks, thereby improving the overall performance of all tasks. Experimental results demonstrate that the proposed VMTL effectively tackles a variety of challenging multi-task learning settings with limited training data, for both classification and regression. Our method consistently surpasses previous methods, including strong Bayesian approaches, and achieves state-of-the-art performance on five benchmark datasets.
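The core mechanism described above — forming a task's prior as a learnable mixture over the posteriors of related tasks, with mixing weights drawn via the Gumbel-Softmax trick — can be illustrated with a minimal sketch. This is not the paper's implementation: the `logits` (task-relatedness scores), the per-task posterior means, and the temperature `tau` below are illustrative placeholders, and only the mean of a Gaussian mixture prior is mixed here for brevity.

```python
import numpy as np

def gumbel_softmax(logits, tau=1.0, rng=None):
    """Sample relaxed one-hot mixing weights from the Gumbel-Softmax
    (Concrete) distribution: softmax((logits + Gumbel noise) / tau)."""
    rng = rng if rng is not None else np.random.default_rng()
    u = rng.uniform(1e-10, 1.0, size=logits.shape)
    gumbel = -np.log(-np.log(u))        # Gumbel(0, 1) noise
    y = (logits + gumbel) / tau
    y = y - y.max()                     # subtract max for numerical stability
    exp_y = np.exp(y)
    return exp_y / exp_y.sum()

# Hypothetical posterior means of 3 related tasks (2-D parameters each);
# `logits` would be learned relatedness scores in the actual framework.
logits = np.array([0.5, 2.0, -1.0])
posterior_means = np.array([[0.0, 1.0],
                            [1.0, 0.0],
                            [0.5, 0.5]])

w = gumbel_softmax(logits, tau=0.5)     # stochastic mixing weights, sum to 1
prior_mean = w @ posterior_means        # mixture prior mean for the current task
```

Lowering `tau` pushes the sampled weights toward a one-hot vector (selecting a single most-related task), while higher temperatures yield smoother mixtures; the noise keeps the selection stochastic yet differentiable with respect to the logits.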
Document type Conference contribution
Note With supplementary materials.
Language English
Published at https://papers.nips.cc/paper_files/paper/2021/hash/afd4836712c5e77550897e25711e1d96-Abstract.html
Other links https://www.proceedings.com/63069.html