Likelihood-Free Inference with Deep Gaussian Processes

Alexander Aushev, Henri Pesonen, Markus Heinonen, Jukka Corander, Samuel Kaski

Research output: Contribution to journal › Article › peer-review

Abstract

Surrogate models have been used successfully in likelihood-free inference to decrease the number of simulator evaluations. The current state-of-the-art performance for this task has been achieved by Bayesian Optimization with Gaussian Processes (GPs). While this combination works well for unimodal target distributions, it restricts the flexibility and applicability of Bayesian Optimization for accelerating likelihood-free inference more generally. This problem is addressed by proposing a Deep Gaussian Process (DGP) surrogate model that can handle more irregularly behaved target distributions. The experiments show that DGPs can outperform GPs on objective functions with multimodal distributions while maintaining comparable performance in unimodal cases. At the same time, DGPs generally require far fewer data to reach the same level of performance as neural density and kernel mean embedding alternatives. This confirms that DGPs as surrogate models can extend the applicability of Bayesian Optimization for likelihood-free inference (BOLFI), while adding only computational overhead that remains negligible for computationally intensive simulators.
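The surrogate-modelling idea summarized above can be illustrated with a minimal BOLFI-style sketch: fit a probabilistic surrogate to simulator discrepancies and use a Bayesian-optimization acquisition rule to choose the next simulation. Everything here is an illustrative assumption, not the paper's method — a toy Gaussian simulator, a single-layer GP written in plain NumPy (the paper's contribution is replacing this surrogate with a DGP), an RBF kernel, and a lower-confidence-bound acquisition.

```python
import numpy as np

rng = np.random.default_rng(0)
observed_mean = 2.0  # hypothetical observed summary statistic

def discrepancy(theta):
    """Toy simulator: distance between simulated and observed summaries."""
    sample = rng.normal(loc=theta, scale=1.0, size=50)
    return abs(sample.mean() - observed_mean)

def rbf(a, b, ls=1.0):
    """Squared-exponential kernel on 1-D inputs."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

def gp_posterior(X, y, Xs, noise=1e-4):
    """Posterior mean and std of a zero-mean GP at query points Xs."""
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.clip(1.0 - np.sum(v ** 2, axis=0), 1e-12, None)
    return mu, np.sqrt(var)

# Initial design of simulator runs, then BO iterations.
X = np.linspace(-5.0, 5.0, 8)
y = np.array([discrepancy(t) for t in X])
grid = np.linspace(-5.0, 5.0, 201)

for _ in range(10):
    # Centre y so the zero-mean GP prior is a reasonable baseline.
    mu, sigma = gp_posterior(X, y - y.mean(), grid)
    mu += y.mean()
    # Lower confidence bound: low predicted discrepancy plus exploration.
    theta_next = grid[np.argmin(mu - sigma)]
    X = np.append(X, theta_next)
    y = np.append(y, discrepancy(theta_next))

best_theta = X[np.argmin(y)]  # parameter with smallest observed discrepancy
```

The loop concentrates simulations where the surrogate predicts low discrepancy; the abstract's point is that a GP surrogate of this kind handles unimodal discrepancy surfaces well, whereas a DGP surrogate can also capture multimodal ones.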

Original language: English
Pages (from-to): 1-19
Number of pages: 19
Journal: Computational Statistics & Data Analysis
Volume: 174
DOIs
Publication status: Published - Oct 2022

Keywords

  • Machine learning

Research Beacons, Institutes and Platforms

  • Institute for Data Science and AI
  • Digital Futures
  • Sustainable Futures
