Generative Uncertainty in Diffusion Models
| Authors | |
|---|---|
| Publication date | 2025 |
| Journal | Proceedings of Machine Learning Research |
| Event | 41st Conference on Uncertainty in Artificial Intelligence, UAI 2025 |
| Volume | 286 |
| Issue number | |
| Pages (from-to) | 1837-1858 |
| Number of pages | 22 |
| Organisations | |
| Abstract | Diffusion models have recently driven significant breakthroughs in generative modeling. While state-of-the-art models produce high-quality samples on average, individual samples can still be low quality. Detecting such samples without human inspection remains a challenging task. To address this, we propose a Bayesian framework for estimating generative uncertainty of synthetic samples. We outline how to make Bayesian inference practical for large, modern generative models and introduce a new semantic likelihood (evaluated in the latent space of a feature extractor) to address the challenges posed by high-dimensional sample spaces. Through our experiments, we demonstrate that the proposed generative uncertainty effectively identifies poor-quality samples and significantly outperforms existing uncertainty-based methods. Notably, our Bayesian framework can be applied post-hoc to any pretrained diffusion or flow matching model (via the Laplace approximation), and we propose simple yet effective techniques to minimize its computational overhead during sampling. |
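The core idea in the abstract — draw generator weights from an approximate (Laplace) posterior around the trained weights, generate from the same latent under each draw, and score uncertainty as the spread of the outputs in a feature extractor's latent space — can be sketched in a toy form. This is a minimal NumPy illustration under simplifying assumptions (a diagonal Gaussian posterior, a linear "decoder", and `tanh` as a stand-in feature extractor); all function names here are hypothetical and not the paper's actual API.

```python
import numpy as np

rng = np.random.default_rng(0)

def generative_uncertainty(w_map, posterior_std, decode, embed, z, n_draws=8):
    """Toy sketch: sample weights from a diagonal Gaussian posterior
    centered at the MAP weights, decode the same latent z under each
    draw, embed the outputs, and return the total feature variance
    across draws as the generative-uncertainty score."""
    feats = []
    for _ in range(n_draws):
        w = w_map + posterior_std * rng.standard_normal(w_map.shape)
        feats.append(embed(decode(w, z)))
    feats = np.stack(feats)          # shape: (n_draws, feat_dim)
    return feats.var(axis=0).sum()   # spread in feature space

# Illustrative stand-ins only (not a real diffusion model or extractor).
decode = lambda w, z: w @ z          # "generator" with weights w
embed = lambda x: np.tanh(x)         # "semantic" feature extractor

w_map = rng.standard_normal((4, 3))
z = rng.standard_normal(3)
u_narrow = generative_uncertainty(w_map, 0.01, decode, embed, z)
u_wide = generative_uncertainty(w_map, 1.0, decode, embed, z)
# A wider posterior over weights yields a higher uncertainty score.
```

In the paper's actual setting the posterior is obtained post-hoc via a Laplace approximation over a pretrained diffusion or flow matching model, and the embedding is a learned feature extractor rather than a fixed nonlinearity; the sketch only conveys the sampling-and-variance structure.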
| Document type | Article |
| Note | Proceedings of the Forty-first Conference on Uncertainty in Artificial Intelligence : 21-25 July 2025, Rio Othon Palace, Rio de Janeiro, Brazil |
| Language | English |
| Published at | https://proceedings.mlr.press/v286/jazbec25a.html |
| Other links | https://github.com/metodj/DIFF-UQ |
| Downloads | jazbec25a (Final published version) |
| Permalink to this page | |