Appl Clin Inform 2023; 14(03): 465-469
DOI: 10.1055/a-2068-6699
Invited Editorial

Beyond Information Design: Designing Health Care Dashboards for Evidence-Driven Decision-Making

Sylvia J. Hysong
1   Center for Innovations in Quality, Effectiveness and Safety, Michael E. DeBakey Veterans Affairs Medical Center, Houston, Texas, United States
2   Department of Medicine – Health Services Research Section, Baylor College of Medicine, Houston, Texas, United States
,
Christine Yang
1   Center for Innovations in Quality, Effectiveness and Safety, Michael E. DeBakey Veterans Affairs Medical Center, Houston, Texas, United States
,
Janine Wong
1   Center for Innovations in Quality, Effectiveness and Safety, Michael E. DeBakey Veterans Affairs Medical Center, Houston, Texas, United States
,
Melissa K. Knox
1   Center for Innovations in Quality, Effectiveness and Safety, Michael E. DeBakey Veterans Affairs Medical Center, Houston, Texas, United States
2   Department of Medicine – Health Services Research Section, Baylor College of Medicine, Houston, Texas, United States
,
Patrick O'Mahen
1   Center for Innovations in Quality, Effectiveness and Safety, Michael E. DeBakey Veterans Affairs Medical Center, Houston, Texas, United States
2   Department of Medicine – Health Services Research Section, Baylor College of Medicine, Houston, Texas, United States
,
Laura A. Petersen
1   Center for Innovations in Quality, Effectiveness and Safety, Michael E. DeBakey Veterans Affairs Medical Center, Houston, Texas, United States
2   Department of Medicine – Health Services Research Section, Baylor College of Medicine, Houston, Texas, United States
Funding This study was supported by the US Department of Veterans Affairs Health Services Research and Development Service (grant numbers: CIN 13-413 and IIR 15-438).

With health care systems experiencing a deluge of exponentially expanding performance measures,[1] [2] dashboards (graphical reports of essential data relevant to a particular objective or process, such as the World Health Organization's Coronavirus Dashboard)[3] have become a common means of efficiently consolidating the monitoring of large numbers of clinical performance and related health care measures across multiple domains. As a result, overcrowded, ineffective dashboards abound.[4]

Much has been written about how to design more visually pleasing, navigable, and interpretable dashboards (known in human factors research as “information design”). Information design, however, assumes dashboard designers already know what information needs to be presented and to whom. Further, dashboards assume a certain level of numeracy and graph literacy in their consumers to be effective.[5] Various frameworks have been proposed to aid in information design,[6] such as an ontology of performance summary displays[7] and the BEhavior and Acceptance fRamework (BEAR)[8] for the design of clinical decision support systems. BEAR consolidates the propositions of four frameworks (including, e.g., the Human, Organization, and Technology-fit framework [HOT-fit][9] and the Unified Theory of Acceptance and Use of Technology [UTAUT][10]) and 10 literature reviews to provide a comprehensive view of the factors needed to successfully design and implement clinical decision support systems and information dashboards. Frameworks such as these provide a comprehensive panorama of the domain of information design and implementation that researchers can use to expand generalizable knowledge; however, they can be overwhelming and unwieldy for the field designer trying to solve a concrete problem for a health care practice by means of a dashboard.

What is needed is a straightforward procedure, or set of rules, for identifying the dashboard content that will yield the most benefit for the problem in question. The literature on performance metric development offers useful insight on this matter.

Hysong et al[11] proposed asking three simple questions to help decision-makers select appropriate quality improvement and performance metrics:

  1. What is the purpose of the metric?

  2. Who is the consumer (or audience) of the metric?

  3. Who is the intended subject, that is, who is being evaluated in this metric?

Just as lacking clear answers to these questions can hinder appropriate performance metric generation and selection, these three factors—unclear purpose, unclear or wrong consumer, and wrong subject—can pose barriers to successful dashboard design and implementation. Below we describe these in more detail and present a case example illustrating the use and benefits of this framework for dashboard design.
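As a rough illustration only (this sketch is ours, not part of the cited framework, and all names in it are hypothetical), the three questions can be treated as a pre-inclusion checklist applied to each candidate metric before it earns a place on a dashboard:

```python
from dataclasses import dataclass
from typing import Optional, List

@dataclass
class DashboardMetric:
    """A candidate metric proposed for inclusion on a dashboard."""
    name: str
    purpose: Optional[str]   # e.g., "quality improvement" or "accountability"
    consumer: Optional[str]  # e.g., "frontline nurses" or "facility leadership"
    subject: Optional[str]   # who is being evaluated, e.g., "primary care team"

def screening_questions(metric: DashboardMetric) -> List[str]:
    """Return the design questions left unanswered for a candidate metric.

    An empty list means the metric has a stated purpose, consumer,
    and subject, and is ready for information-design decisions.
    """
    gaps: List[str] = []
    if not metric.purpose:
        gaps.append("What is the purpose of the metric?")
    if not metric.consumer:
        gaps.append("Who is the consumer (audience) of the metric?")
    if not metric.subject:
        gaps.append("Who is the intended subject being evaluated?")
    return gaps
```

In this framing, a metric with any unanswered question is flagged for clarification rather than added to the dashboard, which operationalizes the point that unclear purpose, consumer, or subject is a barrier to effective design.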

Publication History

Received: 01 November 2022

Accepted: 30 March 2023

Accepted Manuscript online:
04 April 2023

Article published online:
14 June 2023

© 2023. Thieme. All rights reserved.

Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany

 
  • References

  • 1 Agency for Healthcare Research and Quality. National Quality Measures Clearinghouse. 2010. Accessed May 17, 2010 at: http://www.qualitymeasures.ahrq.gov
  • 2 Hysong SJ, Francis J, Petersen LA. Motivating and engaging frontline providers in measuring and improving team clinical performance. BMJ Qual Saf 2019; 28 (05) 405-411
  • 3 World Health Organization. WHO Coronavirus (COVID-19) Dashboard [Web Page]. 2023. Accessed May 8, 2023 at: https://covid19.who.int/
  • 4 Yigitbasioglu OM, Velcu O. A review of dashboards in performance management: implications for design and research. Int J Account Inf Syst 2012; 13 (01) 41-59
  • 5 Lopez KD, Wilkie DJ, Yao Y. et al. Nurses' numeracy and graphical literacy: informing studies of clinical decision support interfaces. J Nurs Care Qual 2016; 31 (02) 124-130
  • 6 Sedrakyan G, Mannens E, Verbert K. Guiding the choice of learning dashboard visualizations: linking dashboard design and data visualization concepts. J Comput Lang 2019; 50: 19-38
  • 7 Lee D, Panicker V, Gross C, Zhang J, Landis-Lewis Z. What was visualized? A method for describing content of performance summary displays in feedback interventions. BMC Med Res Methodol 2020; 20 (01) 90
  • 8 Camacho J, Zanoletti-Mannello M, Landis-Lewis Z, Kane-Gill SL, Boyce RD. A conceptual framework to study the implementation of clinical decision support systems (BEAR): literature review and concept mapping. J Med Internet Res 2020; 22 (08) e18388
  • 9 Yusof MM, Kuljis J, Papazafeiropoulou A, Stergioulas LK. An evaluation framework for health information systems: human, organization and technology-fit factors (HOT-fit). Int J Med Inform 2008; 77 (06) 386-398
  • 10 Venkatesh V, Morris MG, Davis GB, Davis FD. User acceptance of information technology: toward a unified view. Manage Inf Syst Q 2003; 27 (03) 425-478
  • 11 Hysong SJ, O'Mahen P, Profit J, Petersen LA. Purpose, subject, and consumer: Comment on “Perceived burden due to registrations for quality monitoring and improvement in hospitals: a mixed methods study”. Int J Health Policy Manag 2022; 11 (04) 539-543
  • 12 Hysong SJ, Kell HJ, Petersen LA, Campbell BA, Trautner BW. Theory-based and evidence-based design of audit and feedback programmes: examples from two clinical intervention studies. BMJ Qual Saf 2017; 26 (04) 323-334
  • 13 Agency for Healthcare Research and Quality. Data Visualization Best Practices for Primary Care Quality Improvement (QI) Dashboards. Rockville, MD: Author; 2018. Accessed May 8, 2023 at: https://www.ahrq.gov/evidencenow/tools/dashboard-best-practice.html
  • 14 Kaboli PJ, Miake-Lye IM, Ruser C. et al. Sequelae of an evidence-based approach to management for access to care in the Veterans Health Administration. Med Care 2019; 57 (03) S213-S20
  • 15 Foster M, Albanese C, Chen Q. et al. Heart failure dashboard design and validation to improve care of veterans. Appl Clin Inform 2020; 11 (01) 153-159
  • 16 Simpao AF, Ahumada LM, Larru Martinez B. et al. Design and implementation of a visual analytics electronic antibiogram within an electronic health record system at a tertiary pediatric hospital. Appl Clin Inform 2018; 9 (01) 37-45
  • 17 Hester G, Lang T, Madsen L, Tambyraja R, Zenker P. Timely data for targeted quality improvement interventions: use of a visual analytics dashboard for bronchiolitis. Appl Clin Inform 2019; 10 (01) 168-174
  • 18 Froese M-E, Tory M. Lessons learned from designing visualization dashboards. IEEE Comput Graph Appl 2016; 36 (02) 83-89
  • 19 Office of Quality and Performance. FY 2006 Technical Manual for the VHA Performance Measurement System including JCAHO Hospital Core Measures. Veterans Health Administration [Intranet]. 2006. Accessed September 13, 2006 at: http://vaww.oqp.med.va.gov/oqp_services/performance_measurement/uploads/web_performance_measures/2006_perf_meas/FY06%20Tech%20Manual%20Q4%206-06.doc
  • 20 Office of Analytics and Performance Integration. Electronic Technical Manual for the VHA Performance Measurement System. Veterans Health Administration [Intranet]. 2023. Accessed May 8, 2023 at: https://pm.rtp.med.va.gov/ReportServer/Pages/ReportViewer.aspx?/Performance%20Reports/Measure%20Management/MeasureCatalog
  • 21 Petersen LA. Improving the Measurement of VA Facility Performance to Foster a Learning Healthcare System. US Department of Veterans Affairs Health Services Research & Development Service; 2015. Accessed May 8, 2023 at: https://www.hsrd.research.va.gov/research/abstracts.cfm?Project_ID=2141705745
  • 22 Wong JJ, SoRelle RP, Yang C. et al. Nurse leader perceptions of data in the Veterans Health Administration: a qualitative evaluation. Comput Inform Nurs 2023; DOI: 10.1097/CIN.0000000000001003.
  • 23 Chipman SF, Schraagen JM, Shalin VL. Introduction to cognitive task analysis. In: Schraagen JM, Chipman SF, Shalin VL. eds. Cognitive Task Analysis. Mahwah, NJ: Lawrence Erlbaum Associates; 2000: 3-23
  • 24 Gale RC, Wu J, Erhardt T. et al. Comparison of rapid vs in-depth qualitative analytic methods from a process evaluation of academic detailing in the Veterans Health Administration. Implement Sci 2019; 14 (01) 11