Randomized trials with concurrent economic evaluations reported unrepresentatively large clinical effect sizes

Simon Gilbody, Peter Bower, Alex J. Sutton

    Research output: Contribution to journal › Article › peer-review


    Objective: To examine whether randomized economic evaluations report clinical effectiveness estimates that are unrepresentative of the totality of the research literature.

    Study Design and Setting: From 36 studies (12,294 patients) of enhanced care for depression, we used meta-regression to compare pooled clinical effect sizes in studies with a concurrent economic evaluation with those in studies that did not publish one.

    Results: The pooled clinical effect size of studies publishing an economic evaluation was almost twice as large as that of studies that did not publish an economic evaluation (pooled standardized mean difference [SMD] in randomized controlled trials [RCTs] with an economic evaluation = 0.34; 95% confidence interval [CI] = 0.23-0.46; pooled SMD in RCTs without an economic evaluation = 0.17; 95% CI = 0.10-0.25). This difference was statistically significant (SMD between-group difference = -0.17; 95% CI: -0.31 to -0.02; P = 0.02).

    Conclusion: Publication of an economic evaluation of enhanced care for depression was associated with a larger clinical effect size. Cost-effectiveness estimates should be interpreted with caution, and the representativeness of the clinical data on which they are based should always be considered. Further research is needed to explore this observed association and potential bias in other areas. © 2007 Elsevier Inc. All rights reserved.
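    The between-group comparison reported above can be approximately reconstructed from the two subgroup estimates alone. As a minimal sketch (not the authors' actual meta-regression), assuming the subgroups are independent and that a normal approximation holds, back-calculating standard errors from the reported 95% CIs roughly reproduces the stated difference, interval, and P-value:

```python
# Hedged sketch: reconstructing the between-subgroup difference in pooled SMD
# from the two subgroup estimates reported in the abstract. Assumes independent
# subgroups and a normal approximation; illustrative only, not the published
# meta-regression model.
import math

def se_from_ci(lo, hi, z=1.96):
    """Back-calculate a standard error from a 95% confidence interval."""
    return (hi - lo) / (2 * z)

# Reported pooled SMDs (95% CI) for the two subgroups
smd_econ, se_econ = 0.34, se_from_ci(0.23, 0.46)        # RCTs with an economic evaluation
smd_no_econ, se_no_econ = 0.17, se_from_ci(0.10, 0.25)  # RCTs without one

diff = smd_no_econ - smd_econ                  # sign convention: without minus with
se_diff = math.sqrt(se_econ**2 + se_no_econ**2)
ci = (diff - 1.96 * se_diff, diff + 1.96 * se_diff)

z = diff / se_diff
p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))  # two-sided normal P-value

print(f"difference = {diff:.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f}), P = {p:.3f}")
```

    Under these simplifying assumptions the sketch yields a difference of -0.17 with a 95% CI of about -0.31 to -0.03 and P ≈ 0.02, close to the reported values; small discrepancies are expected because the published analysis used meta-regression rather than this independent-subgroups shortcut.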
    Original language: English
    Pages (from-to): 781-e10
    Journal: Journal of Clinical Epidemiology
    Issue number: 8
    Publication status: Published - Aug 2007


    • Depression
    • Economic evaluation
    • Meta-analysis
    • Publication bias
    • Randomized controlled trials
    • Systematic review


