Deep Scale-spaces: Equivariance Over Scale

Open Access
Authors
Publication date 2020
Host editors
  • H. Wallach
  • H. Larochelle
  • A. Beygelzimer
  • F. d'Alché-Buc
  • E. Fox
  • R. Garnett
Book title 33rd Conference on Neural Information Processing Systems (NeurIPS 2019)
Book subtitle Vancouver, Canada, 8-14 December 2019
ISBN
  • 9781713807933
Series Advances in Neural Information Processing Systems
Event 33rd Annual Conference on Neural Information Processing Systems, NeurIPS 2019
Volume | Issue number 10
Pages (from-to) 7334-7346
Publisher San Diego, CA: Neural Information Processing Systems Foundation
Organisations
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
Abstract
We introduce deep scale-spaces (DSS), a generalization of convolutional neural networks that exploits the scale symmetry structure of conventional image recognition tasks. Put plainly, the class of an image is invariant to the scale at which it is viewed. We construct scale equivariant cross-correlations based on a principled extension of convolutions, grounded in the theory of scale-spaces and semigroups. As a very basic operation, these cross-correlations can be used in almost any modern deep learning architecture in a plug-and-play manner. We demonstrate our networks on the Patch Camelyon and Cityscapes datasets to show their utility, and perform introspective studies to further understand their properties.
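To illustrate the flavor of a plug-and-play scale cross-correlation, here is a minimal NumPy/SciPy sketch: one shared filter is applied at several dilation factors and the responses are stacked along a new scale axis. This is an illustrative stand-in, not the paper's exact construction (DSS is built on Gaussian scale-spaces and semigroup cross-correlations); the function names and the choice of dilated filters are assumptions for the sketch.

```python
import numpy as np
from scipy.ndimage import correlate

def dilate_filter(filt, s):
    """Insert s-1 zeros between the taps of a square filter
    (a-trous dilation), widening its receptive field by factor s."""
    k = filt.shape[0]
    d = np.zeros(((k - 1) * s + 1, (k - 1) * s + 1))
    d[::s, ::s] = filt
    return d

def scale_lift_correlate(image, filt, scales=(1, 2, 4)):
    """Cross-correlate one shared filter at several dilations,
    stacking the responses along a new leading scale axis.
    Output shape: (len(scales), H, W)."""
    return np.stack([
        correlate(image, dilate_filter(filt, s), mode='wrap')
        for s in scales
    ])
```

Rescaling the input then (approximately) shifts responses along the new scale axis rather than changing them arbitrarily, which is the kind of equivariance the abstract describes; subsequent layers can share weights across that axis.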
Document type Conference contribution
Note Running title: 33rd Conference on Neural Information Processing Systems (NeurIPS 2019). - With supplemental file.
Language English
Published at https://papers.nips.cc/paper/2019/hash/f04cd7399b2b0128970efb6d20b5c551-Abstract.html
Other links http://www.proceedings.com/53719.html