Optimally-weighted Estimators of the Maximum Mean Discrepancy for Likelihood-Free Inference

Ayush Bharti, Masha Naslidnyk, Oscar Key, Samuel Kaski, François-Xavier Briol

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Likelihood-free inference methods typically make use of a distance between simulated and real data. A common example is the maximum mean discrepancy (MMD), which has previously been used for approximate Bayesian computation, minimum distance estimation, generalised Bayesian inference, and within the nonparametric learning framework. The MMD is commonly estimated at a root-m rate, where m is the number of simulated samples. This can lead to significant computational challenges since a large m is required to obtain an accurate estimate, which is crucial for parameter estimation. In this paper, we propose a novel estimator for the MMD with significantly improved sample complexity. The estimator is particularly well suited for computationally expensive smooth simulators with low- to mid-dimensional inputs. This claim is supported through both theoretical results and an extensive simulation study on benchmark simulators.
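For context, the baseline the abstract refers to is the standard equally-weighted, unbiased (U-statistic) estimator of the squared MMD, whose error decays at the root-m rate. The sketch below is a minimal Python illustration of that baseline estimator, not the optimally-weighted estimator proposed in the paper; the Gaussian kernel and its lengthscale are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(x, y, lengthscale=1.0):
    """Gaussian (RBF) kernel k(x, y) = exp(-||x - y||^2 / (2 * lengthscale^2))."""
    diff = x - y
    return np.exp(-np.dot(diff, diff) / (2.0 * lengthscale**2))

def mmd2_u_statistic(X, Y, kernel=gaussian_kernel):
    """Unbiased U-statistic estimate of MMD^2 between samples X (n, d) and Y (m, d).

    Every kernel evaluation receives equal weight, which is what yields the
    root-m convergence rate in the number of simulated samples m.
    """
    n, m = len(X), len(Y)
    k_xx = sum(kernel(X[i], X[j]) for i in range(n) for j in range(n) if i != j)
    k_yy = sum(kernel(Y[i], Y[j]) for i in range(m) for j in range(m) if i != j)
    k_xy = sum(kernel(x, y) for x in X for y in Y)
    return k_xx / (n * (n - 1)) + k_yy / (m * (m - 1)) - 2.0 * k_xy / (n * m)

# Example: compare "real" data with simulator output from a nearby distribution.
rng = np.random.default_rng(0)
real = rng.normal(0.0, 1.0, size=(100, 2))
simulated = rng.normal(0.5, 1.0, size=(100, 2))
print(mmd2_u_statistic(real, simulated))
```

In a likelihood-free setting, an estimate like this would be computed between observed data and simulator output at each candidate parameter value, so reducing the number of simulations m needed for a given accuracy (as the paper's optimally-weighted estimator does) directly reduces the cost of inference.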
Original language: English
Title of host publication: Proceedings of the 40th International Conference on Machine Learning
Publisher: Journal of Machine Learning Research
Pages: 2289-2312
Number of pages: 24
Publication status: Published - Jul 2023
Event: International Conference on Machine Learning
Duration: 23 Jul 2023 - 29 Jul 2023

Conference

Conference: International Conference on Machine Learning
Period: 23/07/23 - 29/07/23
