Learning Task Relatedness in Multi-Task Learning for Images in Context

Authors
Publication date 2019
Book title ICMR '19
Book subtitle Proceedings of the 2019 ACM International Conference on Multimedia Retrieval: June 10-13, 2019, Ottawa, ON, Canada
ISBN (electronic)
  • 9781450367653
Event 2019 ACM International Conference on Multimedia Retrieval
Pages (from-to) 78-86
Publisher New York, NY: The Association for Computing Machinery
Organisations
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
Abstract
Multimedia applications often require concurrent solutions to multiple tasks. These tasks hold clues to each other's solutions; however, because these relations can be complex, this remains a rarely exploited property. When task relations are explicitly defined based on domain knowledge, multi-task learning (MTL) offers such concurrent solutions while exploiting the relatedness between multiple tasks performed over the same dataset. In most cases, however, this relatedness is not explicitly defined, and the domain-expert knowledge that would define it is not available. To address this issue, we introduce Selective Sharing, a method that learns inter-task relatedness from secondary latent features while the model trains. Using this insight, we can automatically group tasks and allow them to share knowledge in a mutually beneficial way. We support our method with experiments on five datasets covering classification, regression, and ranking tasks, and compare against strong baselines and state-of-the-art approaches, showing consistent improvements in accuracy and parameter counts. In addition, we perform an activation-region analysis showing how Selective Sharing affects the learned representation.
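For background, the following is a minimal sketch of hard parameter sharing, the general MTL setting the abstract builds on: multiple task heads read from one shared representation. This is an illustrative toy in NumPy, not the paper's Selective Sharing method, which additionally learns *which* tasks should share (a grouping this sketch does not implement); all names and shapes here are assumptions for illustration.

```python
import numpy as np

# Hard parameter sharing: one shared backbone feeds several task heads,
# so the tasks reuse (and jointly shape) the same intermediate features.
rng = np.random.default_rng(0)

def shared_backbone(x, w):
    # Shared representation used by every task (ReLU of a linear map).
    return np.maximum(0.0, x @ w)

x = rng.normal(size=(4, 8))          # batch of 4 inputs with 8 features
w_shared = rng.normal(size=(8, 16))  # parameters shared across all tasks
w_cls = rng.normal(size=(16, 3))     # task 1 head: 3-way classification
w_reg = rng.normal(size=(16, 1))     # task 2 head: scalar regression

h = shared_backbone(x, w_shared)
logits = h @ w_cls                   # task 1 output, shape (4, 3)
y_hat = h @ w_reg                    # task 2 output, shape (4, 1)
print(logits.shape, y_hat.shape)
```

In training, the per-task losses computed from `logits` and `y_hat` would be summed and backpropagated through `w_shared`, which is how relatedness between tasks is exploited; Selective Sharing, per the abstract, determines the sharing pattern automatically rather than fixing it up front.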
Document type Conference contribution
Note This research is supported by the VISTORY project NWO award number 628.007.004.
Language English
Published at https://doi.org/10.1145/3323873.3325009