Learning to learn with less and less

Open Access
Award date 19-11-2025
Number of pages 197
Organisations
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
Abstract
This thesis investigates how artificial intelligence can learn efficiently with limited data, bridging the gap between data-hungry foundation models and human-like adaptability. While large-scale models such as GPT-4 and Gemini achieve impressive generalization through massive pre-training, their fine-tuning performance deteriorates in data-scarce regimes. Inspired by human cognition—particularly episodic and semantic memory—this research explores how AI systems can learn to learn, reusing prior experience to accelerate adaptation.
The thesis develops a unified framework across three complementary dimensions: batch learning, which enhances stability through adaptive batch statistics in low-data settings; memory learning, which incorporates semantic and episodic memory mechanisms to enable rapid task adaptation; and generative learning, which leverages variational inference and diffusion-based synthesis to expand data diversity and robustness. These principles are applied to practical challenges including domain generalization, long-tailed recognition, and vision-language modeling.
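The first dimension, stabilizing batch statistics when only a few samples are available, can be illustrated with a minimal sketch. This is not the thesis's actual method; the linear blend schedule and the `batch_size_ref` parameter are assumptions made here for illustration. The idea is that per-batch mean and variance estimates are noisy for small batches, so they are interpolated toward running (population) statistics, with more weight on the batch as it grows:

```python
import numpy as np

def adaptive_batch_stats(batch, running_mean, running_var, batch_size_ref=32):
    """Blend noisy small-batch statistics with running statistics.

    The blend weight `w` grows with the batch size, so tiny batches lean
    on the running estimates while large batches trust their own stats.
    (Illustrative schedule only; `batch_size_ref` is a hypothetical knob.)
    """
    n = batch.shape[0]
    w = n / (n + batch_size_ref)  # small n -> rely on running statistics
    mean = w * batch.mean(axis=0) + (1.0 - w) * running_mean
    var = w * batch.var(axis=0) + (1.0 - w) * running_var
    return mean, var
```

With a batch of 32 constant samples and `batch_size_ref=32`, the weight is 0.5, so the returned statistics sit halfway between the batch's and the running values.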
By integrating meta-learning, memory-based architectures, and generative modeling, this work provides a cohesive pathway toward data-efficient AI systems capable of human-like generalization and continual adaptation across dynamic environments.
Document type PhD thesis
Language English