Appl Clin Inform 2024; 15(04): 727-732
DOI: 10.1055/a-2345-6475
CIC 2023

A Discount Approach to Reducing Nursing Alert Burden

Authors

  • Sarah A. Thompson

    1   Information Services and Technology, Children's Healthcare of Atlanta, Atlanta, Georgia, United States
  • Swaminathan Kandaswamy

    2   Department of Pediatrics, Emory University School of Medicine, Atlanta, Georgia, United States
  • Evan Orenstein

    1   Information Services and Technology, Children's Healthcare of Atlanta, Atlanta, Georgia, United States
    2   Department of Pediatrics, Emory University School of Medicine, Atlanta, Georgia, United States
 

Abstract

Background Numerous programs have arisen to address interruptive clinical decision support (CDS) with the goals of reducing alert burden and alert fatigue. These programs often have standing committees with broad stakeholder representation, significant governance efforts, and substantial analyst hours to achieve reductions in alert burden, which can be difficult for hospital systems to replicate.

Objective This study aimed to reduce nursing alert burden with a primary nurse informaticist and small support team through a quality-improvement approach focusing on high-volume alerts.

Methods Target alerts were identified from the period of January 2022 to April 2022, and four of the highest-firing alerts were chosen initially; these accounted for 43% of all interruptive nursing alerts and an estimated 86 hours per month of time, across all nurses, spent resolving these alerts. Work was done concurrently for each alert, with design changes based on the Five Rights of CDS and following a quality-improvement framework. Work was prioritized based on operational engagement for design review and approval. Once initial design changes were approved, alerts underwent in situ usability testing and additional changes were made as needed. Final designs were presented to stakeholders for approval prior to implementation.

Results The total number of interruptive nursing alert firings decreased by 58% from the preintervention period (January 1, 2022–June 30, 2022) to the postintervention period (July 1, 2022–December 31, 2022). Action taken on alerts increased from 8.1% to 17.3%. The estimated time spent resolving interruptive alerts, summed across all nurses in the system, decreased from 197 hours/month to 114 hours/month.

Conclusion While CDS may improve use of evidence-based practices, implementation without a clear framework for evaluation and monitoring often results in alert burden and fatigue without clear benefits. An alert burden reduction effort spearheaded by a single empowered nurse informaticist efficiently reduced nursing alert burden substantially.


Background and Significance

Computerized clinical decision support (CDS) in the electronic health record (EHR) has allowed for real-time support to help clinicians adhere to evidence-based practice, legal and regulatory requirements, and institutional policies.[1] [2] [3] [4] While CDS has been shown to improve quality of care and clinical outcomes, one of the most significant unintended consequences has been the increase in alert burden leading to alert fatigue.[5] [6] [7] [8] The phenomenon of excessive nuisance alerts was originally identified in clinical monitoring but has been shown to have a similar effect in the EHR, leading to desensitization and ultimately reduced response.[6] [7] [9] [10] [11] This reduced user response to appropriate alerts can lead to safety events despite adequate warning from the EHR. Many alerts are built in response to a request to support evidence-based practice or prevent a harm event but lack an evaluation framework or ongoing plan to measure the effectiveness of the intervention. In response to this perceived ineffectiveness, numerous programs have arisen to reduce alert burden, mitigate alert fatigue, and better target interruptive CDS.[5] [12] [13] However, these programs have often required standing committees with broad stakeholder representation, significant governance efforts, and substantial analyst hours to achieve reductions in alert burden. Many health systems do not have the bandwidth to replicate these programs. In addition, it can be challenging to identify potential nuisance alerts or alerts in need of optimization when programs are waiting for requestors to reach out about issues.

A significant nursing alert burden was identified in the system. This burden stemmed from a mature EHR system, the lack of a CDS team when the EHR was first implemented, no review framework or “update” schedule for existing alerts, a reactive feedback-based response to alert performance, and many custom alerts. This project aimed to reduce nursing alert burden through a quality-improvement approach focusing on high-volume alerts. Given local resource constraints, stakeholders were limited to a small team to improve alerts with minimal impact on ongoing programs and informatics operational work. The team was composed of a nursing informaticist, a physician informaticist, and a human factors engineer. The nursing informaticist was tasked with identifying target alerts and planning alert reduction. The physician informaticist provided support and feedback on the alert process. The human factors engineer supported user-centered design and usability testing for the alerts.


Methods

Identification of Target Alerts

Target alerts were identified using an embedded EHR tool (Epic SlicerDicer) that provides line-level data on alert details; from these data we created a Pareto chart of our highest-firing alerts, with a focus on potential quick wins to bring to successive stakeholders. For example, one alert aiming to improve situational awareness of cardiac shunts fired interruptively approximately 12,000 times per month. By combining noninterruptive options, alert firing was reduced by 99% without loss of situational awareness.[14] We then initially chose 4 of the top 10 highest-firing alerts with relatively clear improvement opportunities. These top four alerts accounted for 43% of all interruptive nursing alerts and an estimated 86 hours of nonproductive nursing time per month.[15] Two additional alerts were identified as candidates during the working period and are included in the [Supplementary Material] (available in the online version).
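The Pareto ranking described above can be sketched in a few lines of Python. The alert names and firing counts below are illustrative placeholders, not data pulled from SlicerDicer:

```python
# Rank alerts by firing volume and compute each alert's cumulative share of
# total interruptive burden, as in a Pareto chart. All names and counts are
# hypothetical examples for illustration only.
firings = {
    "Cardiac shunt awareness": 12_000,
    "Fall risk reassessment": 9_500,
    "Restraint documentation": 7_800,
    "Pain reassessment": 6_200,
    "Allergy review": 1_400,
}

total = sum(firings.values())
ranked = sorted(firings.items(), key=lambda kv: -kv[1])

cumulative = 0
pareto = []  # (alert, firings, cumulative share of total)
for name, count in ranked:
    cumulative += count
    pareto.append((name, count, cumulative / total))
    print(f"{name}: {count:>6} ({cumulative / total:.0%} cumulative)")
```

With counts like these, a handful of top alerts accounts for the large majority of firings, which is what makes a "quick wins" strategy viable.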


Prioritization and Process for Design Change

The goal was to work on the alerts concurrently, with additional priority based on operational owner availability and engagement. For each alert burden reduction proposal, we developed a basic evaluation framework with the goal of reducing alert firings without worsening the metrics the alert was intended to improve (if available). The minimal metrics for each alert were total alert firings and, if applicable, user action rate. The framework started with a comparison of the existing alert against the Five Rights of CDS[16] followed by a plan using the Institute for Healthcare Improvement (IHI) Model for Improvement[17] for ongoing monitoring of alert performance. Alert design updates followed the principles of user-centered design for the initial iteration. Each proposal was reviewed in existing CDS committees including physician and nursing clinical informaticists, pharmacy informatics, and information services and technology personnel. The lead nurse informaticist would then bring vetted proposals to operational alert owners and iteratively update the design based on feedback from those groups. Once the initial design update was complete, the updated alert went through in situ usability testing with the target staff to ensure the alert performed as intended. For in situ usability testing,[18] a patient was created in a test version of the EHR in which the proposed CDS would fire. The lead nurse informaticist and human factors engineer would then go to the clinical sites where the proposed users might see the alert (e.g., nurses on the general care floor or in the intensive care unit) and identify a prospective user who was willing to devote 5 to 10 minutes to assist with EHR design. A clinical scenario was described to the user, and they were asked to “think aloud”[19] as they completed the relevant clinical task.
After the CDS displayed and users took or did not take the intended action, users were interviewed to understand any design updates that might have led them to the intended action, and updates were made between participants. Once in situ usability testing reached the point that users completed the intended action or no further low-effort design updates could be identified, the final designs were re-shared with the original requesting stakeholders ([Fig. 1]).

Fig. 1 Initial design created by group, taken out for in situ usability testing with iterative design updates until no further changes identified.

Evaluation of Intervention

We visualized the total number of alert firings and the alert acceptance rate for nurses graphically over time. We compared the total number of interruptive alert firings seen by nurses in the 6-month periods before (January 1, 2022–June 30, 2022) and after (July 1, 2022–December 31, 2022) implementation of the major alert burden reduction efforts. We also calculated the estimated hours spent by nurses resolving interruptive alerts assuming 8 seconds per alert.[15] To ensure that alert burden reductions were not associated with a detriment to patient care, we also compared the primary metric for three of the alerts for which outcome data were available before and after alert burden reduction efforts (see [Supplementary Appendix], available in the online version) as balancing metrics. Comparisons of alert firings, hours spent on interruptive alerts, and outcome rates were done using Student's t-test. Alert action-taken rates were compared using χ² tests.
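As a sketch of the action-rate comparison, the Pearson χ² statistic for a 2×2 table (acted vs. did not act, before vs. after) can be computed directly; the counts below are illustrative placeholders, not the study data, and in practice a library routine such as scipy.stats.chi2_contingency would typically be used:

```python
# Pearson chi-squared statistic for a 2x2 table of acted / not-acted counts
# in two periods. Counts passed in below are hypothetical, not study data.
def chi2_2x2(acted_a, total_a, acted_b, total_b):
    """Return the Pearson chi-squared statistic (1 degree of freedom)."""
    table = [
        [acted_a, total_a - acted_a],
        [acted_b, total_b - acted_b],
    ]
    n = total_a + total_b
    row_totals = [total_a, total_b]
    col_totals = [table[0][j] + table[1][j] for j in range(2)]
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_totals[i] * col_totals[j] / n
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# With 1 degree of freedom, a statistic above 3.84 corresponds to p < 0.05.
stat = chi2_2x2(acted_a=790, total_a=10_000, acted_b=1_600, total_b=10_000)
print(stat > 3.84)
```

An action-rate jump of this size over tens of thousands of firings produces a statistic far above the 0.05 threshold, consistent with the highly significant differences reported in the Results.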


Ethical Considerations

These efforts were conducted as nonhuman subjects research: a quality-improvement initiative for alert burden reduction across the health system.



Results

The alert reduction work started in February 2022, with the major alert updates occurring in July 2022. From January 1, 2022, through June 30, 2022, our nurses saw 532,265 interruptive alerts (88,710 per month, an estimated 197 total nursing hours per month) and acted 7.9% of the time. In the postintervention period, from July 2022 to December 2022, interruptive nursing alert burden was reduced to 308,846 firings (51,474 per month, an estimated 114 total nursing hours per month), with action being taken 16.0% of the time (p < 0.001 for differences in alerts per month and proportion of time action was taken). This trend continued into 2023 ([Figs. 2] and [3]), with the next 6 months (January 1, 2023–June 30, 2023) having 222,959 interruptive alerts shown to nurses (37,160 per month, an estimated 83 total nursing hours per month).
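The hours estimates above follow directly from the 8-seconds-per-alert assumption applied to the 6-month totals; a minimal back-of-the-envelope check:

```python
# Estimated monthly nursing hours spent resolving interruptive alerts,
# assuming 8 seconds per alert (totals taken from the Results section).
SECONDS_PER_ALERT = 8

def hours_per_month(total_firings, months=6):
    """Estimated nursing hours per month across all nurses in the system."""
    return round(total_firings / months * SECONDS_PER_ALERT / 3600)

print(hours_per_month(532_265))  # preintervention:  197
print(hours_per_month(308_846))  # postintervention: 114
print(hours_per_month(222_959))  # Jan-Jun 2023:      83
```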

Fig. 2 Total interruptive nursing alert firings from January 2022 to June 2023.
Fig. 3 Percent of interruptive nursing alerts with action taken by the user from January 2022 to June 2023.

To achieve these reductions, we estimate that over a 3-month period the resources required included 50 hours of work from the primary nurse informaticist, 5 hours to review proposals in our CDS committee, 10 hours to pull data for balancing metrics (two of the three existed prior to this intervention), and 9 hours in meetings with operational alert owners to provide feedback and approve alert reduction proposals for the July alert updates.


Discussion

This quality-improvement project significantly reduced interruptive nursing alert burden using a single nurse clinical informaticist owner empowered to reduce nursing alerts with guidance from a CDS committee and operational alert owners. This small-team approach with a single nurse informaticist lead reduced the need for stakeholder meetings, simplified coordination of the project, and ensured clear ownership of build changes throughout the process. This approach is novel compared with published examples of alert burden reduction strategies,[5] [12] [13] requires fewer resources, and therefore may be more easily adoptable across health systems with fewer informatics-dedicated resources. Using a nurse informaticist with extensive experience to specifically target nursing alerts helped the team understand the sociotechnical system in which the alerts fired. It also helped reduce barriers to collecting feedback on alerts, connecting with existing nurse leadership structures, and engaging with frontline clinicians. Using a standard process for alert review increased the efficiency of the review process by providing a clear road map for how to move forward with each alert. It also allowed for comparisons so that the alerts could be updated consistently. The smaller group allowed for easier pivoting as alerts were removed from and added into the process.

The most common adjustments to high-firing interruptive alerts included language updates and timing adjustments ([Table 1]). Many EHR vendors have increased the options for customization of alerts which allows for both clearer instructions and better targeting to more specific hooks in the workflow. For example, alerts that fire upon chart opening are often conveying information that is not useful to the user for the task they are trying to complete or is too early and the information is forgotten by the time the user is completing the relevant task. Newer triggers provided more noninterruptive options or more targeted options to the exact right time in the workflow when a nudge may be more effective. Similarly, many of these high-firing alerts contained large amounts of text, often resulting from “design by committee” where requestors wanted to ensure that all relevant information was transmitted. By contrast, our small team focused on direct communication where if the user read the first couple of words of each line, they would have a sense of what to do. Standardizing the language has also helped users know where to process the “important” information. Many of the alert changes focused on content of the alert such as standardizing terminology, guidance to providers on next steps, consistency in alert representation, and the ability to resolve the alert directly.[20]

Table 1 Strategies to mitigate common ineffective alert design

| Ineffective alert design | Mitigation strategy |
| Wrong timing in the workflow (often used “Open Chart” trigger) | Adjusted trigger to fire at the expected time of addressing the workflow (e.g., managing orders, filing a flowsheet row) |
| Too many words to read | Reduced text whenever possible; focused on how to take an action (e.g., “Click ‘Accept’ to…”) |
| Lack of clarity on reason for alert appearance | Used a consistent phrasing framework: “Why am I seeing this alert?” “What should I do?” |
| Inability to act from the alert | Added documentation and ordering actions within the BPA whenever possible, or linked to the activity where action should be taken |

In addition to using CDS design principles and leveraging new EHR capabilities for better targeting, in situ usability testing was essential in helping update alert designs and implement more effective approaches. Despite Five Rights and heuristic reviews by an expert team, most alert designs nonetheless benefitted from additional updates after usability testing with prospective users. Usability testing data also helped us demonstrate to the original alert requestors when an interruptive alert was unlikely to affect user behavior, guiding them to alternative solutions such as workflow adjustments.


Limitations

This alert burden reduction effort was undertaken within a single health system with its own history of EHR implementation, alert development and maintenance, and culture around EHR changes. The focus on high-firing alerts with a small team engaging in usability testing may not generalize to organizations without easily adjustable high-firing alerts or where usability testing is not culturally accepted. Similarly, organizations with larger informatics groups and more resources may derive greater benefit from a more comprehensive approach involving more stakeholders.[12] [13] Additionally, the goal of this project was to reduce total nursing alert burden and increase action taken on the alerts without compromising the alerts' original purpose. While it is relatively simple to measure alert performance in terms of total firings and action taken, it was not possible to accurately measure all the patient outcomes related to each alert. This approach was limited to nurse-facing alerts to reduce stakeholder needs; it may not generalize to provider-facing alerts or may require a different team including more stakeholders with an understanding of different provider workflows.


Conclusion

While CDS can improve the use of evidence-based practices, implementation without a clear framework for evaluation and ongoing monitoring often results in alert burden and fatigue without clear benefits. An alert burden reduction effort spearheaded by a single empowered nurse clinical informaticist embedded in a CDS governance structure and with in situ usability testing efficiently reduced nursing alert burden substantially. Strategies that facilitated this work included utilizing a nurse informaticist with a strong clinical background and institutional relationships, focusing on high-firing low action alerts, systematic evaluation using the Five Rights of CDS, in situ usability testing of candidate designs, and the IHI Model for Improvement for evaluation of alert rationalization effectiveness.


Clinical Relevance Statement

Interruptive alerts can improve use of evidence-based practices, but poorly built alerts contribute to alarm fatigue and desensitization. Optimizing alerts to reduce alert firing can be done by a small team with significant impact. The quality-improvement approach described here can be used by institutions interested in reducing alert burden that lack the resources or bandwidth for large interdisciplinary clinical informatics teams to address this issue.


Multiple-Choice Questions

  1. What is the most effective data visualization to identify high-firing alerts that may benefit from rationalization?

    • Comparison chart

    • Pareto chart

    • Histogram

    • Control chart

    Correct Answer: The correct answer is option b. A Pareto chart allows for easy visualization and identification of culprit alerts that are contributing to overall alert burden.

  2. What constitutes a minimal set of metrics to monitor alert performance?

    • Total alert firings

    • Percent of firings where the user takes an action

    • User feedback with focus on “cranky comments”

    • Outcome metrics (i.e. the metric the alert is intended to improve)

    • a, b, and c

    • All the above

Correct Answer: The correct answer is option e. When reviewing alert performance, it is important to consider both how often the alert fires and how often users act. Even if an alert is low firing, if no user acts, it may benefit from review and redesign. User feedback often helps to identify the types of alerts that are most frustrating to clinicians and to ensure alerts are interpreted by users as intended. All of these measures can be applied across alerts with few resources. Outcome metrics (i.e., the metric the alert was intended to move) are often the most important, but these generally require the most resources to build and must be customized for each alert. Alert rationalization efforts can generally make progress even if the outcome metric is not automated and easily captured.



Conflict of Interest

S.A.T. has nothing to disclose. S.K. has nothing to disclose. E.O. is a co-founder and holds equity in Phrase Health, a clinical decision support analytics company. He has also served as Principal Investigator on Phase 1 and Phase 2 STTR grants with Phrase Health from the National Library of Medicine (NLM) and National Center for the Advancement of Translational Science (NCATS). He receives no direct revenue from Phrase Health but has received salary support from NLM and NCATS.

Protection of Human and Animal Subjects

This work was felt to be primarily focused on quality improvement and therefore deemed nonhuman subjects research by the Institutional Review Board of Children's Healthcare of Atlanta.



Address for correspondence

Sarah A. Thompson, MSHMI, BSN, RN
Information Services and Technology
Children's Healthcare of Atlanta - Support Building 1575 Northeast Expressway, Atlanta, GA 30329
United States   

Publication History

Received: 03 November 2023

Accepted: 12 June 2024

Accepted Manuscript online:
14 June 2024

Article published online:
04 September 2024

© 2024. Thieme. All rights reserved.

Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany
