Dynamic conditional independence models and Markov chain Monte Carlo methods

Carlo Berzuini, Nicola G. Best, Walter R. Gilks, Cristiana Larizza

    Research output: Contribution to journal › Article › peer-review

    Abstract

    In dynamic statistical modeling situations, observations arise sequentially, causing the model to expand by progressive incorporation of new data items and new unknown parameters. For example, in clinical monitoring, patients and data arrive sequentially, and new patient-specific parameters are introduced with each new patient. Markov chain Monte Carlo (MCMC) might be used for continuous updating of the evolving posterior distribution, but it would need to be restarted from scratch at each expansion stage; MCMC methods are therefore often too slow for real-time inference in dynamic contexts. By combining MCMC with importance resampling, we show how real-time sequential updating of posterior distributions can be effected. The proposed dynamic sampling algorithms use posterior samples from previous updating stages and exploit conditional independence between groups of parameters, so that samples of parameters no longer of interest, for example those specific to a patient who has died or been discharged, can be discarded. We apply the methods to the monitoring of heart transplant recipients during infection with cytomegalovirus.
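
    The combination of importance resampling with an MCMC move described in the abstract can be sketched generically. The Python fragment below is a minimal illustration under assumed ingredients (a single scalar parameter, a Normal observation model, a flat prior, and a random-walk Metropolis-Hastings proposal), none of which are taken from the paper; it shows only the general pattern of reweighting, resampling, and MCMC rejuvenation as new observations arrive, not the authors' specific algorithm.

        import numpy as np

        rng = np.random.default_rng(0)

        def log_likelihood(theta, y):
            # Hypothetical observation model: y ~ Normal(theta, 1); an assumption
            # for illustration, not the model used in the paper.
            return -0.5 * (y - theta) ** 2

        def mh_move(theta, data, step=0.5):
            # One random-walk Metropolis-Hastings step targeting the posterior
            # given all data observed so far (flat prior assumed for simplicity).
            prop = theta + step * rng.standard_normal()
            log_acc = sum(log_likelihood(prop, y) for y in data) - \
                      sum(log_likelihood(theta, y) for y in data)
            return prop if np.log(rng.random()) < log_acc else theta

        n_particles = 1000
        particles = rng.standard_normal(n_particles)  # draws from an assumed prior
        data_so_far = []

        for y_new in [0.8, 1.1, 0.9]:  # observations arriving sequentially
            # 1. Reweight the current posterior sample by the new observation's
            #    likelihood (importance weights).
            log_w = log_likelihood(particles, y_new)
            w = np.exp(log_w - log_w.max())
            w /= w.sum()

            # 2. Importance resampling: retain samples in proportion to weight.
            particles = rng.choice(particles, size=n_particles, replace=True, p=w)

            # 3. MCMC rejuvenation step to restore diversity after resampling.
            data_so_far.append(y_new)
            particles = np.array([mh_move(th, data_so_far) for th in particles])

        print("posterior mean estimate:", particles.mean())

    In this sketch the particle set plays the role of the posterior sample carried forward between updating stages; discarding parameters no longer of interest would correspond to dropping the associated components of each particle once conditional independence makes them irrelevant to future updates.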
    Original language: English
    Pages (from-to): 1403-1412
    Number of pages: 9
    Journal: Journal of the American Statistical Association
    Volume: 92
    Issue number: 440
    Publication status: Published - Dec 1997

    Keywords

    • Bayesian inference
    • Graphical model
    • Importance sampling
    • Metropolis-Hastings algorithm
    • Real-time forecasting
    • Sequential updating
