TY - JOUR
T1 - Local dimension reduction of summary statistics for likelihood-free inference
AU - Sirén, Jukka
AU - Kaski, Samuel
N1 - Funding Information:
Open access funding provided by Aalto University. We acknowledge the computational resources provided by the Aalto Science-IT project and support by the Finnish Center for Artificial Intelligence Research FCAI. The work was supported by the Academy of Finland (Grants 319264, 294238 and 292334).
Publisher Copyright:
© 2019, The Author(s).
PY - 2020/5/1
Y1 - 2020/5/1
AB - Approximate Bayesian computation (ABC) and other likelihood-free inference methods have gained popularity in the last decade, as they allow rigorous statistical inference for complex models without analytically tractable likelihood functions. A key component for accurate inference with ABC is the choice of summary statistics, which summarize the information in the data, but at the same time should be low-dimensional for efficiency. Several dimension reduction techniques have been introduced to automatically construct informative and low-dimensional summaries from a possibly large pool of candidate summaries. Projection-based methods, which are based on learning simple functional relationships from the summaries to parameters, are widely used and usually perform well, but might fail when the assumptions behind the transformation are not satisfied. We introduce a localization strategy for any projection-based dimension reduction method, in which the transformation is estimated in the neighborhood of the observed data instead of the whole space. Localization strategies have been suggested before, but the performance of the transformed summaries outside the local neighborhood has not been guaranteed. In our localization approach the transformation is validated and optimized over validation datasets, ensuring reliable performance. We demonstrate the improvement in the estimation accuracy for localized versions of linear regression and partial least squares, for three different models of varying complexity.
KW - Approximate Bayesian computation
KW - Dimension reduction
KW - Likelihood-free inference
KW - Summary statistics
UR - http://www.scopus.com/inward/record.url?scp=85074362445&partnerID=8YFLogxK
DO - 10.1007/s11222-019-09905-w
M3 - Article
AN - SCOPUS:85074362445
SN - 0960-3174
VL - 30
SP - 559
EP - 570
JO - Statistics and Computing
JF - Statistics and Computing
IS - 3
ER -
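
The abstract above describes localizing a projection-based dimension reduction of ABC summary statistics around the observed data. The sketch below is a minimal illustration of that general idea, not the authors' implementation: it assumes a toy Gaussian simulator and hypothetical names (simulate, s_obs, project), uses a plain localized linear-regression projection, and omits the validation-and-optimization step over validation datasets that is the paper's main contribution.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta):
    # Toy simulator (an assumption for illustration): 50 Gaussian draws with
    # mean theta, reduced to 5 candidate summary statistics.
    x = rng.normal(theta, 1.0, size=50)
    return np.array([x.mean(), x.std(), np.median(x), x.max(), x.min()])

# "Observed" data summarized with the same candidate summaries.
s_obs = simulate(2.0)

# 1. Simulate (parameter, summary) pairs from the prior.
thetas = rng.uniform(-5.0, 5.0, size=5000)
S = np.vstack([simulate(t) for t in thetas])

# 2. Localize: keep simulations whose summaries fall in a neighborhood of the
#    observed summaries (here, the 10% smallest Euclidean distances).
d = np.linalg.norm(S - s_obs, axis=1)
local = d <= np.quantile(d, 0.10)

# 3. Fit a linear regression from summaries to the parameter on the local
#    subset; the fitted coefficients define a one-dimensional projection.
X = np.column_stack([np.ones(local.sum()), S[local]])
beta, *_ = np.linalg.lstsq(X, thetas[local], rcond=None)

def project(s):
    # Low-dimensional summary: the linear projection learned locally.
    return np.atleast_2d(s) @ beta[1:] + beta[0]

# 4. ABC rejection using the projected summary to measure distance.
dist = np.abs(project(S) - project(s_obs)).ravel()
accepted = thetas[dist <= np.quantile(dist, 0.01)]
print("ABC posterior mean estimate:", accepted.mean())
```

The same pattern applies with other projection methods (e.g. partial least squares) in place of the least-squares fit; the point of the localization is that the projection is learned only from simulations near the observed summaries.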