Learning to convolve: A generalized weight-tying approach
| Authors | Nichita Diaconu, Daniel E. Worrall |
|---|---|
| Publication date | 2019 |
| Journal | Proceedings of Machine Learning Research |
| Event | 36th International Conference on Machine Learning, ICML 2019 |
| Volume | 97 |
| Pages (from-to) | 1586-1595 |
| Abstract | Recent work (Cohen & Welling, 2016a) has shown that generalizations of convolutions, based on group theory, provide powerful inductive biases for learning. In these generalizations, filters are not only translated but can also be rotated, flipped, etc. However, coming up with an exact model of how to rotate a 3 × 3 filter on a square pixel grid is difficult.<br><br>In this paper, we learn how to transform filters for use in the group convolution, focusing on roto-translation. For this, we learn a filter basis and all rotated versions of that basis. Filters are then encoded by a set of rotation-invariant coefficients. To rotate a filter, we switch the basis. We demonstrate that we can produce feature maps with low sensitivity to input rotations, while achieving high performance on MNIST and CIFAR-10. |
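The abstract's core idea can be sketched in a few lines: a filter is stored as a fixed coefficient vector over a learned basis, and "rotating" the filter simply means re-expanding the same coefficients in a rotated copy of that basis. The sketch below is an illustrative NumPy assumption, not the authors' implementation; the array names, shapes, and the random stand-in for the learned basis are all hypothetical.

```python
import numpy as np

n_rot, n_basis, k = 4, 8, 3  # rotations, basis size, filter width (assumed)

# Stand-in for a learned basis and its rotated versions: bases[r] holds the
# r-th rotated copy of all n_basis basis filters. In the paper these are
# learned; here they are random placeholders.
rng = np.random.default_rng(0)
bases = rng.standard_normal((n_rot, n_basis, k, k))

# Rotation-invariant coefficients defining one filter.
coeffs = rng.standard_normal(n_basis)

def filter_at_rotation(r):
    """Expand the same coefficients in the r-th rotated basis."""
    return np.tensordot(coeffs, bases[r], axes=1)  # shape (k, k)

# One k-by-k filter per rotation, all tied to the same coefficients.
stack = np.stack([filter_at_rotation(r) for r in range(n_rot)])
print(stack.shape)  # (4, 3, 3)
```

Because the coefficients never change, weight tying across rotations comes for free: only the basis is swapped, which is what gives the feature maps their low sensitivity to input rotation.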
| Document type | Article |
| Note | 36th International Conference on Machine Learning (ICML 2019), Long Beach, California, USA, 9-15 June 2019. With supplementary file. In print proceedings pp. 2854-2866. |
| Language | English |
| Published at | http://proceedings.mlr.press/v97/diaconu19a.html |
| Other links | http://www.proceedings.com/48979.html, https://github.com/NichitaDiaconu/Learning-to-Convolve |
| Downloads | diaconu19a (Final published version) |
