Sparse and Continuous Attention Mechanisms

Open Access
Authors
  • P. Aguiar
  • M. Figueiredo
Publication date 2021
Host editors
  • H. Larochelle
  • M. Ranzato
  • R. Hadsell
  • M.F. Balcan
  • H. Lin
Book title 34th Conference on Neural Information Processing Systems (NeurIPS 2020)
Book subtitle online, 6-12 December 2020
ISBN
  • 9781713829546
Series Advances in Neural Information Processing Systems
Event Advances in Neural Information Processing Systems 2020
Volume | Issue number 26
Pages (from-to) 20989-21001
Number of pages 13
Publisher San Diego, CA: Neural Information Processing Systems Foundation
Organisations
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
Abstract
Exponential families are widely used in machine learning; they include many distributions in continuous and discrete domains (e.g., Gaussian, Dirichlet, Poisson, and categorical distributions via the softmax transformation). Distributions in each of these families have fixed support. In contrast, for finite domains, there has been recent work on sparse alternatives to softmax (e.g., sparsemax and alpha-entmax), which have varying support, being able to assign zero probability to irrelevant categories. These discrete sparse mappings have been used for improving interpretability of neural attention mechanisms. This paper expands that work in two directions: first, we extend alpha-entmax to continuous domains, revealing a link with Tsallis statistics and deformed exponential families. Second, we introduce continuous-domain attention mechanisms, deriving efficient gradient backpropagation algorithms for alpha in {1,2}. Experiments on attention-based text classification, machine translation, and visual question answering illustrate the use of continuous attention in 1D and 2D, showing that it allows attending to time intervals and compact regions.
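Illustrative note (not part of the record): the sparse mapping mentioned in the abstract, sparsemax, corresponds to the alpha = 2 case of alpha-entmax and can be computed in closed form as a Euclidean projection onto the probability simplex. A minimal NumPy sketch of that standard projection, independent of the paper's own code, is:

```python
import numpy as np

def sparsemax(z):
    """Project a score vector z onto the probability simplex (sparsemax).

    Unlike softmax, the result may contain exact zeros, assigning
    zero probability to irrelevant categories.
    """
    z = np.asarray(z, dtype=float)
    z_sorted = np.sort(z)[::-1]             # scores in descending order
    cumsum = np.cumsum(z_sorted)
    k = np.arange(1, len(z) + 1)
    support = 1 + k * z_sorted > cumsum     # coordinates kept in the support
    k_z = k[support][-1]                    # support size
    tau = (cumsum[support][-1] - 1) / k_z   # threshold
    return np.maximum(z - tau, 0.0)

# Example: the lowest score receives exactly zero probability.
print(sparsemax([1.5, 1.2, -0.7]))          # [0.65, 0.35, 0.]
```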
Document type Conference contribution
Note With supplemental information.
Language English
Published at https://papers.neurips.cc/paper/2020/hash/f0b76267fbe12b936bd65e203dc675c1-Abstract.html
Other links https://www.proceedings.com/59066.html