Learning useful representations with variational autoencoders

Open Access
Award date 19-03-2024
Number of pages 188
Organisations
  • Faculty of Science (FNWI) - Informatics Institute (IVI)
Abstract
Variational Autoencoders (VAEs) are one of the best-known frameworks for combining probabilistic and deep learning approaches. Like many classical probabilistic models, VAEs pursue two intertwined objectives. The first is to approximate the underlying distribution of the data, thereby enabling the generation of new samples that mirror those drawn from the true distribution. The second is to extract valuable insights from the data. This PhD thesis focuses on the latter objective, investigating a variety of inductive biases that guide the model towards learning a useful representation of the data. The first form of inductive bias we explore is the modification of the objective function to impose certain constraints on the latent space, such as hierarchical independence, a desirable property for learning a disentangled representation. The next inductive bias we examine is the incorporation of structure into the latent space that is more closely aligned with prior knowledge about the underlying data, as we demonstrate in the case of neural topic models. A related inductive bias we experiment with is designing the latent space of the model so that its geometry matches the intrinsic manifold of the data. Beyond the exploration of inductive biases, we also propose an alternative approach inspired by energy-based models, defining the likelihood as an exponential family whose sufficient statistic is parameterized by a neural network. Finally, we investigate the generalization capability of VAEs to novel data by performing a rate-distortion analysis on a variety of models with different levels of network capacity and different difficulties of the generalization problem.
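To make the two-term objective mentioned above concrete, the following is a minimal, generic sketch (not taken from the thesis) of a single-sample ELBO estimate for a Gaussian VAE in NumPy: a reconstruction term plus the closed-form KL divergence between a diagonal-Gaussian posterior and a standard-normal prior, with latent samples drawn via the reparameterization trick. The linear encoder/decoder weights `enc_w` and `dec_w` are illustrative placeholders, not the architectures studied in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

def kl_diag_gauss(mu, logvar):
    # Closed-form KL( N(mu, diag(exp(logvar))) || N(0, I) ), summed over latent dims
    return 0.5 * np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar, axis=-1)

def elbo(x, enc_w, dec_w):
    # Toy linear "encoder": amortized posterior mean; unit variance for simplicity
    mu = x @ enc_w
    logvar = np.zeros_like(mu)
    # Reparameterization trick: z = mu + sigma * eps, eps ~ N(0, I)
    eps = rng.standard_normal(mu.shape)
    z = mu + np.exp(0.5 * logvar) * eps
    # Gaussian likelihood with unit variance: log p(x|z) up to an additive constant
    x_hat = z @ dec_w
    rec = -0.5 * np.sum((x - x_hat) ** 2, axis=-1)
    # One single-sample ELBO estimate per observation in the batch
    return rec - kl_diag_gauss(mu, logvar)

x = rng.standard_normal((4, 3))            # batch of 4 observations, data dim 3
enc_w = rng.standard_normal((3, 2)) * 0.1  # 2-dimensional latent space
dec_w = rng.standard_normal((2, 3)) * 0.1
print(elbo(x, enc_w, dec_w).shape)         # one ELBO estimate per sample
```

The inductive biases surveyed in the abstract can be read as interventions on this objective: reweighting or constraining the KL term (as in disentanglement-oriented objectives) or replacing the Gaussian latent structure with one matched to the data.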
Document type PhD thesis
Language English