TY - JOUR

T1 - Approximate Laplace importance sampling for the estimation of expected Shannon information gain in high-dimensional Bayesian design for nonlinear models

AU - Englezou, Yiolanda

AU - Waite, Timothy

AU - Woods, David C.

PY - 2022/9/30

Y1 - 2022/9/30

N2 - One of the major challenges in Bayesian optimal design is to approximate the expected utility function in an accurate and computationally efficient manner. We focus on Shannon information gain, one of the most widely used utilities when the experimental goal is parameter inference. We compare the performance of various methods for approximating expected Shannon information gain in common nonlinear models from the statistics literature, with a particular emphasis on Laplace Importance Sampling (LIS) and approximate Laplace Importance Sampling (ALIS), a new method that aims to reduce the computational cost of LIS. Specifically, in order to centre the importance distributions, LIS requires computation of the posterior mode for each of a large number of simulated possibilities for the response vector. ALIS substantially reduces the amount of numerical optimization that is required, in some cases eliminating all optimization, by centring the importance distributions on the data-generating parameter values wherever possible. Both methods are thoroughly compared with existing approximations including Double Loop Monte Carlo, nested importance sampling, and Laplace approximation. It is found that LIS and ALIS both give an efficient trade-off between mean squared error and computational cost for utility estimation, and ALIS can be up to 70% cheaper than LIS. Usually ALIS gives an approximation that is cheaper but less accurate than LIS, while still being efficient, making it a useful addition to the suite of efficient methods. However, we observed one case where ALIS is both cheaper and more accurate. In addition, for the first time we show that LIS and ALIS yield superior designs to existing methods in problems with large numbers of model parameters when combined with the approximate co-ordinate exchange algorithm for design optimization.

AB - One of the major challenges in Bayesian optimal design is to approximate the expected utility function in an accurate and computationally efficient manner. We focus on Shannon information gain, one of the most widely used utilities when the experimental goal is parameter inference. We compare the performance of various methods for approximating expected Shannon information gain in common nonlinear models from the statistics literature, with a particular emphasis on Laplace Importance Sampling (LIS) and approximate Laplace Importance Sampling (ALIS), a new method that aims to reduce the computational cost of LIS. Specifically, in order to centre the importance distributions, LIS requires computation of the posterior mode for each of a large number of simulated possibilities for the response vector. ALIS substantially reduces the amount of numerical optimization that is required, in some cases eliminating all optimization, by centring the importance distributions on the data-generating parameter values wherever possible. Both methods are thoroughly compared with existing approximations including Double Loop Monte Carlo, nested importance sampling, and Laplace approximation. It is found that LIS and ALIS both give an efficient trade-off between mean squared error and computational cost for utility estimation, and ALIS can be up to 70% cheaper than LIS. Usually ALIS gives an approximation that is cheaper but less accurate than LIS, while still being efficient, making it a useful addition to the suite of efficient methods. However, we observed one case where ALIS is both cheaper and more accurate. In addition, for the first time we show that LIS and ALIS yield superior designs to existing methods in problems with large numbers of model parameters when combined with the approximate co-ordinate exchange algorithm for design optimization.

U2 - 10.1007/s11222-022-10159-2

DO - 10.1007/s11222-022-10159-2

M3 - Article

VL - 32

JO - Statistics and Computing

JF - Statistics and Computing

SN - 0960-3174

M1 - 82

ER -