ProtoDiff: Learning to Learn Prototypical Networks by Task-Guided Diffusion

Open Access
Authors
Publication date 2023
Host editors
  • A. Oh
  • T. Naumann
  • A. Globerson
  • K. Saenko
  • M. Hardt
  • S. Levine
Book title 37th Conference on Neural Information Processing Systems (NeurIPS 2023)
Book subtitle 10-16 December 2023, New Orleans, Louisiana, USA
ISBN (electronic)
  • 9781713899921
Series Advances in Neural Information Processing Systems
Event 37th Conference on Neural Information Processing Systems (NeurIPS 2023)
Number of pages 19
Publisher Neural Information Processing Systems Foundation
Organisations
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
Abstract
Prototype-based meta-learning has emerged as a powerful technique for addressing few-shot learning challenges. However, estimating a deterministic prototype by simply averaging a limited number of examples remains a fragile process. To overcome this limitation, we introduce ProtoDiff, a novel framework that leverages a task-guided diffusion model during the meta-training phase to gradually generate prototypes, thereby providing efficient class representations. Specifically, a set of prototypes is optimized to achieve per-task prototype overfitting, enabling accurate recovery of the overfitted prototypes for individual tasks. Furthermore, we introduce a task-guided diffusion process within the prototype space, enabling the meta-learning of a generative process that transitions from a vanilla prototype to an overfitted prototype. During the meta-test stage, ProtoDiff gradually generates task-specific prototypes from random noise, conditioned on the limited samples available for the new task. In addition, to expedite training and enhance ProtoDiff's performance, we propose residual prototype learning, which exploits the sparsity of the residual prototype. Thorough ablation studies demonstrate ProtoDiff's ability to accurately capture the underlying prototype distribution and enhance generalization. The new state-of-the-art performance on within-domain, cross-domain, and few-task few-shot classification further substantiates the benefit of ProtoDiff.
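The "vanilla prototype" the abstract refers to is the standard Prototypical Networks construction: a per-class mean of support-set embeddings, with queries classified by nearest prototype. A minimal NumPy sketch of that baseline (not the authors' implementation; all function names and the toy data below are illustrative) helps clarify what ProtoDiff's diffusion process refines:

```python
import numpy as np

def vanilla_prototypes(support_embeddings, support_labels, n_classes):
    # Deterministic prototype: per-class mean of support embeddings,
    # as in standard Prototypical Networks. ProtoDiff treats this
    # average as the starting point that its task-guided diffusion
    # process refines toward a per-task "overfitted" prototype.
    return np.stack([
        support_embeddings[support_labels == c].mean(axis=0)
        for c in range(n_classes)
    ])

def classify(query_embeddings, prototypes):
    # Nearest-prototype classification by Euclidean distance.
    dists = np.linalg.norm(
        query_embeddings[:, None, :] - prototypes[None, :, :], axis=-1
    )
    return dists.argmin(axis=1)

# Toy 2-way, 5-shot task with 3-D embeddings (purely illustrative).
rng = np.random.default_rng(0)
support = np.concatenate([
    rng.normal(0.0, 0.1, size=(5, 3)),   # class 0 clustered near the origin
    rng.normal(1.0, 0.1, size=(5, 3)),   # class 1 clustered near (1, 1, 1)
])
labels = np.array([0] * 5 + [1] * 5)

protos = vanilla_prototypes(support, labels, n_classes=2)
queries = np.array([[0.05, 0.0, -0.05], [0.95, 1.05, 1.0]])
print(classify(queries, protos))  # -> [0 1]
```

With few shots, this mean is a noisy estimate of the class centroid, which is the fragility the abstract points at; ProtoDiff instead learns a generative process that denoises toward a task-specific prototype, and the residual-prototype variant predicts only the (sparse) offset from this vanilla average.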
Document type Conference contribution
Language English
Published at https://papers.nips.cc/paper_files/paper/2023/hash/911dd89c81efc624c4e1c39381179505-Abstract-Conference.html
Other links https://doi.org/10.52202/075280