Evaluating Visual Analytics for Health Informatics Applications: A Systematic Review from the AMIA VIS Working Group Task Force on Evaluation

Danny T. Y. Wu, Annie T. Chen, John D. Manning, Gal Levy-Fix, Una Backonja, David Borland, Jesus J. Caban, Dawn Dowding, Harry Hochheiser, Vadim Kagan, Swaminathan Kandaswamy, Manish Kumar, Alexis Nunez, Eric Pan, David Gotz

Research output: Contribution to journal › Article › peer-review


Abstract

Objective: This article reports results from a systematic review of literature related to the evaluation of data visualizations and visual analytics technologies within the health informatics domain. The aims of this review were to (1) characterize the variety of evaluation methods that have been adopted within the health informatics community and (2) identify best practices for future research in this area.

Methods: A systematic literature review was conducted following PRISMA guidelines. PubMed searches were conducted in February 2017 using three groups of search terms representing key concepts of interest: health care settings, visualization, and evaluation of visualizations. References were also screened for eligibility. Data were extracted from included studies and analyzed using a PICOS framework: Participants, Interventions, Comparators, Outcomes, and Study Design.

Results: After title, abstract, and full text screening, 76 publications met the review criteria. Publications varied across all PICOS dimensions. The most common audience was healthcare providers (n=43), and the most common data gathering methods were direct observation (n=30) and surveys (n=27). About half of the publications focused on static, concentrated views of data with visuals (n=36). Evaluations were heterogeneous regarding setting and measurements used.

Discussion: When evaluating data visualizations and visual analytics technologies, a variety of approaches have been used. Usability measures were used most often in early (prototype) implementations, whereas clinical outcomes were most common in evaluations of operationally-deployed systems. These findings suggest opportunities for both (1) expanding evaluation practices, and (2) innovation with respect to evaluation methods for data visualizations and visual analytics technologies across health settings.
Original language: English
Journal: Journal of the American Medical Informatics Association
Early online date: 14 Feb 2019
Publication status: Published - 2019

Keywords

  • Review (MeSH: V02.600.500)
  • Evaluation Studies (MeSH: V03.400)

