CC BY-NC-ND 4.0 · ACI open 2022; 06(02): e94-e97
DOI: 10.1055/s-0042-1757156
Case Report

The Impact of an Organization-Wide Electronic Health Record (EHR) System Upgrade on Physicians' Daily EHR Activity Time: An EHR Log Data Study

Lori Wong
1   Department of Biomedical Informatics, University of Arkansas for Medical Sciences, Little Rock, Arkansas, United States
,
Kevin W. Sexton
1   Department of Biomedical Informatics, University of Arkansas for Medical Sciences, Little Rock, Arkansas, United States
2   Department of Surgery, UAMS College of Medicine, Little Rock, Arkansas, United States
3   Department of Health Policy and Management, Fay W. Boozman College of Public Health, Little Rock, Arkansas, United States
,
Joseph A. Sanford
1   Department of Biomedical Informatics, University of Arkansas for Medical Sciences, Little Rock, Arkansas, United States
5   UAMS Institute for Digital Health & Innovation, Little Rock, Arkansas, United States
6   Department of Anesthesiology, UAMS College of Medicine, Little Rock, Arkansas, United States
Funding Research reported in this publication was supported by the National Center For Advancing Translational Sciences of the National Institutes of Health under award numbers UL1 TR003107, KL2 TR003108, TL1 TR003109, and R01 GM111324. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.
 

Abstract

Objective This article assesses the impact of a health care organization's electronic health record (EHR) upgrade on providers' daily EHR activity time.

Methods Daily EHR activity time (minutes/day) was acquired through EHR log data that automatically tracks user activity. Subjects were attending and resident physicians in the departments of family medicine, hospitalist medicine, and the neonatal intensive care unit working in the inpatient setting. The EHR upgrade occurred in August 2020, and the comparison groups were pre-upgrade (May 31, 2020–July 25, 2020) and post-upgrade (August 30, 2020–October 31, 2020). A two-tailed, two-sample t-test was used to assess statistical significance.

Results The pre-upgrade group had 146 users, and the post-upgrade group had 140 users. There was no statistically significant difference between the pre-upgrade group (mean [M]: 104.74 minutes/day, standard deviation [SD]: 70.64) and the post-upgrade group (M: 103.38 minutes/day, SD: 64.77), even after splitting the data by user type, and by user type and department.

Conclusion This study showed no significant difference in daily EHR activity time post-upgrade. More research is needed to understand the impact of EHR upgrades on user efficiency. The content of each upgrade might be key to its effect on users, and we hope to explore that in future work.



Introduction

After the passage of the Health Information Technology for Economic and Clinical Health Act in 2009, electronic health record (EHR) adoption and use increased rapidly, with around 86% of office-based physicians and 96% of nonfederal hospitals using an EHR system as of 2017.[1] [2] While the purpose of EHR adoption was to improve health care quality, efficiency, and health outcomes, EHR use has been associated with provider burnout and decreased satisfaction.[3] This is significant because around half of all clinician time is spent in the EHR.[4] [5] From October 12, 2017 to March 15, 2018, Melnick et al assessed the usability of EHR systems among U.S. physicians with the system usability scale (SUS), an industry-standard measure of usability. The SUS score for EHR systems was 46/100, placing them in the bottom 9% relative to other industries, including everyday products, for which the average score across studies is 68/100.[6]

Additionally, EHRs require maintenance and a continuous upgrade cycle, with a frequency ranging from quarterly to yearly, to sustain optimal performance. Upgrades consist of regulatory, system maintenance, and user functionality items. The upgrade cycle starts with senior-level nursing, clinical, and pharmacy informaticists and EHR analysts reviewing the vendor's upgrade notes and deciding which items will be included in the upcoming quarterly upgrade. Build tasks are then created and distributed to teams with specific focus areas. The new builds then undergo testing and, lastly, training. The upgrade cycle may vary between health care systems and EHR vendors.

While upgrades may ultimately improve security, stability, and interoperability, they can also cause disruption through downtime and altered workflows, which can lead to user dissatisfaction and decreased efficiency. To gain insight into where users spend time in the EHR within the inpatient setting, Epic Systems Inc (Verona, Wisconsin, United States) created the Inpatient Provider Efficiency Profile (IP PEP) tool, which uses automatically captured log data to measure the time spent (in minutes) on different activities within the EHR, the frequency of those activities, and patient volume in the inpatient setting.[7] IP PEP can also measure total activity time (minutes) per day for each provider to better understand the amount of time spent in the EHR system.



Objectives

While studies have assessed the usability of different EHR systems or the transition from a paper-based system to an EHR, there is a knowledge gap regarding the effect of system-wide upgrades on the time a user spends working in the EHR in the inpatient setting.[8] [9] [10] This study aims to quantify that impact by comparing time spent in the EHR before and after a system-wide quarterly upgrade.



Methods

This study took place at a tertiary academic medical center in the Southern U.S. The quarterly upgrade analyzed occurred in August 2020. As part of the upgrade process, the EHR vendor categorized each upgrade item by training difficulty, ranging from none to considerable. Changes that impacted physicians were identified by reviewing the Inpatient Provider Training tip sheets created for the August upgrade; any changes substantial enough to impact providers would need to be communicated to them.

The IP PEP tool extracts log data from the Epic EHR system as monthly spreadsheets. The raw data was extracted from these spreadsheets, and the total EHR activity time (minutes) per day was calculated for each user. Data from two workbooks served as the two groups, representing pre-upgrade (May 31, 2020–July 25, 2020) and post-upgrade (August 30, 2020–October 31, 2020). The data was aggregated in 4-week intervals, and data between July 26, 2020 and August 29, 2020 was excluded because it could not be further separated around the quarterly EHR upgrade Go-Live date of August 7, 2020. Total EHR activity time per day was used as a measure of user efficiency, with the premise that if an upgrade improved usability, this would be reflected as a decrease in time spent in the EHR, and vice versa.
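
As a concrete illustration of this step, the sketch below shows how daily activity time could be derived from monthly spreadsheet exports using Python and pandas. This is not the authors' actual pipeline, and the column names (user_id, log_date, active_minutes) and file names are hypothetical placeholders for whatever fields an IP PEP export contains.

```python
# Minimal sketch (assumed column names, not the study's actual code) of
# deriving total EHR activity time per user per day from monthly log exports.
import pandas as pd

PRE_WINDOW = ("2020-05-31", "2020-07-25")    # pre-upgrade study window
POST_WINDOW = ("2020-08-30", "2020-10-31")   # post-upgrade study window

def load_pep_workbooks(paths):
    """Read one or more monthly spreadsheet exports into a single frame."""
    df = pd.concat([pd.read_excel(p) for p in paths], ignore_index=True)
    df["log_date"] = pd.to_datetime(df["log_date"])
    return df

def daily_activity(df, window):
    """Sum active minutes per user per calendar day within a study window."""
    start, end = pd.Timestamp(window[0]), pd.Timestamp(window[1])
    in_window = df[(df["log_date"] >= start) & (df["log_date"] <= end)]
    return (in_window
            .groupby(["user_id", "log_date"])["active_minutes"]
            .sum()
            .reset_index(name="minutes_per_day"))

# Hypothetical usage:
# pre = daily_activity(load_pep_workbooks(["pep_2020_06.xlsx", "pep_2020_07.xlsx"]), PRE_WINDOW)
# post = daily_activity(load_pep_workbooks(["pep_2020_09.xlsx", "pep_2020_10.xlsx"]), POST_WINDOW)
```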

The dependent variable was total activity time (minutes) per day, and the independent variable was the pre- or post-upgrade period. Data was further categorized by user type (attending or resident physician), and then by user type and department. The departments chosen were inpatient service lines with the largest total number of attending and resident physicians: internal medicine, family medicine, and the neonatal intensive care unit. Intensive care settings were excluded from the internal medicine and family medicine groups. Any department containing "clinic" in its name, services considered outpatient/ambulatory, emergency medicine, inpatient psychiatry, anesthesiology, operating rooms, laboratory, and radiology were excluded from the analysis.
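
The inclusion and exclusion rules above could be expressed as a simple filter, sketched below under the assumption of hypothetical user_type and department columns; the exact field names and department strings in the export are not specified in the source and are illustrative only.

```python
# Hedged sketch of the categorization/exclusion logic described above.
# "user_type" and "department" are hypothetical column names.
INCLUDED_DEPARTMENTS = {"Internal Medicine", "Family Medicine",
                        "Neonatal Intensive Care Unit"}
EXCLUDED_KEYWORDS = ("clinic", "emergency", "psychiatry", "anesthesiology",
                     "operating room", "laboratory", "radiology")

def categorize(df):
    """Drop excluded service areas, then keep attending and resident
    physicians in the three studied inpatient departments."""
    dept = df["department"].str.lower()
    df = df[~dept.str.contains("|".join(EXCLUDED_KEYWORDS))]
    df = df[df["department"].isin(INCLUDED_DEPARTMENTS)]
    return df[df["user_type"].isin(["Attending", "Resident"])].copy()
```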

The descriptive analysis consisted of the sample size, mean, and standard deviation (SD) of activity time pre- and post-upgrade for each department and user type. A two-sample t-test was selected to assess statistical significance because the number of participants on the inpatient services is consistent between months, while the individual participants change regularly, especially residents. Statistical analysis was performed in JMP 15.1 (SAS Institute, Inc., Cary, North Carolina, United States, 1989–2021), and significance was set at 0.05.
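
The analysis itself was performed in JMP; for readers working in other tools, the sketch below shows an equivalent computation of the descriptive statistics and a two-tailed, two-sample t-test, applied to the hypothetical minutes_per_day series from the earlier sketch.

```python
# Equivalent (not the study's actual JMP analysis) two-sample comparison.
from scipy import stats

def compare_groups(pre_minutes, post_minutes, alpha=0.05):
    """Summarize both groups and run a two-tailed, two-sample t-test."""
    summary = {
        "pre":  {"n": len(pre_minutes), "mean": pre_minutes.mean(),
                 "sd": pre_minutes.std(ddof=1)},
        "post": {"n": len(post_minutes), "mean": post_minutes.mean(),
                 "sd": post_minutes.std(ddof=1)},
    }
    # scipy's ttest_ind is two-sided by default (pooled-variance Student's t)
    t_stat, p_value = stats.ttest_ind(pre_minutes, post_minutes)
    return summary, t_stat, p_value, p_value < alpha
```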



Results

For the August 2020 upgrade, 1,251 of 1,301 upgrade notes were reviewed, and 828 were implemented as part of the upgrade. The Inpatient Provider Training tip sheet contained 8 general updates and 15 physician updates. The upgrade analyzed was a regular quarterly upgrade with a mix of changes, including improving security, staying up to date with regulations, updating quality measures, adding functionality for ease of access, and changing layouts for improved readability. Although there were changes to the user interface, none were major changes requiring substantial training based on the EHR vendor's recommendations.

The pre-upgrade group contained 146 users, of whom 42.5% were attendings and 57.5% were residents; the mean daily activity time was 104.74 minutes/day (SD: 70.64). The post-upgrade group contained 140 users, with 40.0% attendings and 60.0% residents; the mean daily activity time was 103.38 minutes/day (SD: 64.77). The difference in the proportion of attending and resident physicians pre- and post-upgrade was not statistically significant.
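
The article does not state which test was used to compare the attending/resident mix between periods; one standard option, sketched below, is a chi-square test of independence on the 2 × 2 table of counts implied by the reported group sizes and percentages.

```python
# Hedged illustration only; the study's actual test for proportions is not reported.
from scipy.stats import chi2_contingency

# Counts implied by the reported figures:
# pre-upgrade: 62 attendings, 84 residents; post-upgrade: 56 attendings, 84 residents
counts = [[62, 84],
          [56, 84]]
chi2, p_value, dof, expected = chi2_contingency(counts)
print(f"chi2 = {chi2:.3f}, p = {p_value:.3f}")  # consistent with the reported non-significance
```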

The difference in total activity time (minutes/day) between the pre-upgrade and post-upgrade groups was not statistically significant (p = 0.866). In addition, the difference in daily activity time was not statistically significant when the pre- and post-upgrade groups were categorized by user type (attending or resident) or by user type and department ([Table 1]).

Table 1 Daily EHR activity time (minutes per day) by user type and department pre- and post-upgrade

Physicians

| Department | n (pre) | n (post) | Mean (pre) | SD (pre) | Mean (post) | SD (post) | Mean difference (post–pre) | p-Value |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Family medicine | 12 | 8 | 95.78 | 46.01 | 77.24 | 44.59 | –18.54 | 0.3822 |
| Internal medicine | 23 | 21 | 130.4 | 61.03 | 134.09 | 56.12 | 3.69 | 0.8355 |
| Neonatal intensive care unit | 27 | 27 | 63.55 | 35.19 | 68.39 | 27.98 | 4.84 | 0.579 |
| Physicians, total | 62 | 56 | 94.59 | 56.24 | 94.29 | 52.39 | –0.30 | 0.9763 |

Residents

| Department | n (pre) | n (post) | Mean (pre) | SD (pre) | Mean (post) | SD (post) | Mean difference (post–pre) | p-Value |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Family medicine | 22 | 18 | 133.05 | 69.09 | 142.85 | 43.45 | 9.8 | 0.5881 |
| Internal medicine | 16 | 16 | 109.6 | 49.87 | 130.2 | 55.24 | 20.6 | 0.2771 |
| Neonatal intensive care unit | 46 | 50 | 103.18 | 90.51 | 90.77 | 78.53 | –12.41 | 0.4766 |
| Residents, total | 84 | 84 | 112.23 | 79.12 | 109.44 | 71.51 | –2.79 | 0.8112 |

Abbreviations: EHR, electronic health record; SD, standard deviation.




Discussion

Data are lacking on the impact of generalized, system-wide EHR upgrades on the time users spend in the EHR system. Outside of the health care setting, a 2017 field study by Vitale et al, in which 14 participants completed 4-week diaries in 2015 after upgrading an operating system of their choice, found that participants experienced twice as many negative changes as positive ones, and 6 participants planned to delay, avoid, or find a way to skip the long wait time for future upgrades.[11] In other studies analyzing time spent in the EHR system in the inpatient setting, physicians spent an average of approximately 4 to 5 hours per workday; the average total EHR activity time per day in this study was lower, at approximately 1 hour and 45 minutes.[5] [12]

Targeted approaches have been effective in reducing EHR time and cognitive workload.[13] [14] For instance, Mazur et al, using simulated EHR environments, enrolled 38 participants, assigned them in a blinded fashion to either a baseline EHR or an enhanced EHR, and found that the baseline group had higher cognitive workloads and poorer performance in managing abnormal laboratory results.[13] The enhanced EHR was designed and tested by a physician, a human factors engineer, and an EHR software developer and included functionality and usability testing.[13] Semanik et al assessed the use and impact of a problem-oriented view (POV), a display that aggregates relevant data for each clinical problem, on completion time, error rate, usability, and workload with 51 internal medicine residents from three academic medical centers and found that all four metrics improved with the POV compared with the standard view (control).[14]

Targeted interventions that consider the health system's work culture, workflows, and direct user feedback might improve usability. However, the lack of change in daily EHR time might also reflect the fact that the user interface did not change drastically during this EHR upgrade. A 2011 paper by Khoo et al examined end-users experiencing organization-wide upgrades of Windows and of SAP, a form of enterprise application software; the drastic user interface changes that came with SAP took longer for users to adapt to than the minimal user interface changes with Windows.[15] The impact of system-wide EHR upgrades on usability, however, is not well studied and would benefit from further published research.

This study had multiple limitations. It used data from a single tertiary care hospital in a rural, Southern U.S. state, analyzed the effect of only a single EHR upgrade, and took place during the coronavirus disease 2019 pandemic, limiting generalizability to other EHR implementations. In addition, the study examined 8 weeks before and 8 weeks after the upgrade, a window selected to capture the risk of setback post-upgrade that may be insufficient to show a significant positive change; examining daily EHR time over time in a time series study would be the next step to assess for change. The period from July 26, 2020 to August 29, 2020 was excluded from the analysis because of the limited granularity of the data set, so the immediate impact of the upgrade, and any regression back to the mean, may have been missed. Certain EHR upgrades might also have more impact than others: this specific upgrade might not have affected daily EHR user activity, and because regular quarterly upgrades are not focused entirely on improving usability, usability may be unlikely to change or, for that matter, to disrupt existing workflows. Due to the retrospective nature of the study, interviews and validated surveys, such as the SUS or the NASA Task Load Index (NASA-TLX), could not be used to assess the EHR's usability before and after the upgrade, and physician characteristics, such as years in practice and years using the EHR system, could not be collected. Confounding factors, such as patient load and whether physicians appeared in both groups or only one, were also not accounted for. A strength, however, is that the study is based on log data and is therefore not subject to certain types of bias, such as observer bias or recall bias.

Our future work in this area includes evaluating changes in user efficiency over the course of different EHR upgrades; analyses that account for patient load and for users present in both groups versus only the pre- or post-upgrade group; and examination of upgrade items focused on different clinical activities, such as notes, orders, and chart review, and on different clinical settings. Another consideration for further work on EHR upgrades would be a segmented time series study to assess overall trends before and after an upgrade.
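
As a sketch of the segmented time series approach proposed above as future work, the hypothetical model below regresses a weekly mean of daily EHR time on an overall time trend plus level- and slope-change terms at the upgrade week. It is not part of this study; the weekly series and the upgrade week index are placeholders.

```python
# Interrupted (segmented) time series sketch; inputs are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def segmented_regression(weekly_minutes, upgrade_week):
    """Fit mean_minutes ~ time + level change + post-upgrade slope change."""
    df = pd.DataFrame({"mean_minutes": weekly_minutes})
    df["time"] = np.arange(len(df))                        # overall weekly trend
    df["post"] = (df["time"] >= upgrade_week).astype(int)  # level shift at the upgrade
    df["time_since"] = np.clip(df["time"] - upgrade_week, 0, None)  # slope change after upgrade
    return smf.ols("mean_minutes ~ time + post + time_since", data=df).fit()

# Hypothetical usage with 16 weekly means spanning the pre- and post-upgrade windows:
# fit = segmented_regression(weekly_minutes=[...], upgrade_week=8)
# print(fit.summary())
```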



Conclusion

This study showed that EHR upgrades might not be sufficient to significantly change user efficiency in the EHR. On the other hand, targeted interventions have shown some potential in decreasing EHR time and improving usability. However, more research is needed to better understand the impact of EHR upgrades on users, especially user efficiency and experience.



Clinical Relevance Statement

In 2017, EHR adoption in the U.S. was 86% among office-based physicians and 96% among nonfederal acute care hospitals.[1] [2] More research on the effect of EHR upgrades on users, including user efficiency, is important because EHR use in the U.S. is widespread and all EHRs undergo regular upgrade cycles that can impact multiple systems and users across the country.



Conflict of Interest

None declared.

Protection of Human and Animal Subjects

No human subjects were involved in the project.


  • References

  • 1 Office of the National Coordinator for Health Information Technology (ONC). Office-based Physician Electronic Health Record Adoption. Health IT Quick-Stat #50. 2019. Accessed February 25, 2020 at: https://dashboard.healthit.gov/quickstats/pages/physician-ehr-adoption-trends.php
  • 2 Office of the National Coordinator for Health Information Technology (ONC). Non-federal Acute Care Hospital Electronic Health Record Adoption. Health IT Quick-Stat #47. 2017. Accessed April 2, 2021 at: https://dashboard.healthit.gov/quickstats/pages/FIG-Hospital-EHR-Adoption.php
  • 3 Shanafelt TD, Dyrbye LN, Sinsky C. et al. Relationship between clerical burden and characteristics of the electronic environment with physician burnout and professional satisfaction. Mayo Clin Proc 2016; 91 (07) 836-848
  • 4 Tai-Seale M, Olson CW, Li J. et al. Electronic health record logs indicate that physicians split time evenly between seeing patients and desktop medicine. Health Aff (Millwood) 2017; 36 (04) 655-662
  • 5 Wang JK, Ouyang D, Hom J, Chi J, Chen JH. Characterizing electronic health record usage patterns of inpatient medicine residents using event log data. PLoS One 2019; 14 (02) e0205379
  • 6 Melnick ER, Dyrbye LN, Sinsky CA. et al. The association between perceived electronic health record usability and professional burnout among US physicians. Mayo Clin Proc 2020; 95 (03) 476-487
  • 7 Kadish SS, Mayer EL, Jackman DM. et al. Implementation to optimization: A tailored, data-driven approach to improve provider efficiency and confidence in use of the electronic medical record. J Oncol Pract 2018; 14 (07) e421-e428
  • 8 Carayon P, Wetterneck TB, Alyousef B. et al. Impact of electronic health record technology on the work and workflow of physicians in the intensive care unit. Int J Med Inform 2015; 84 (08) 578-594
  • 9 Hanauer DA, Branford GL, Greenberg G. et al. Two-year longitudinal assessment of physicians' perceptions after replacement of a longstanding homegrown electronic health record: does a J-curve of satisfaction really exist?. J Am Med Inform Assoc 2017; 24 (e1): e157-e165
  • 10 Ehrlich JR, Michelotti M, Blachley TS. et al. A two-year longitudinal assessment of ophthalmologists' perceptions after implementing an electronic health record system. Appl Clin Inform 2016; 7 (04) 930-945
  • 11 Vitale F, McGrenere J, Tabard A. et al. High costs and small benefits: a field study of how users experience operating system upgrades. Conf Hum Factors Comput Syst Proc 2017; x: 4242-4253
  • 12 Verma G, Ivanov A, Benn F. et al. Analyses of electronic health records utilization in a large community hospital. PLoS One 2020; 15 (07) e0233004
  • 13 Mazur LM, Mosaly PR, Moore C, Marks L. Association of the usability of electronic health records with cognitive workload and performance levels among physicians. JAMA Netw Open 2019; 2 (04) e191709
  • 14 Semanik MG, Kleinschmidt PC, Wright A. et al. Impact of a problem-oriented view on clinical data retrieval. J Am Med Inform Assoc 2021; 28 (05) 899-906
  • 15 Khoo HM, Robey D, Rao SV. An exploratory study of the impacts of upgrading packaged software: a stakeholder perspective. J Inf Technol 2011; 26: 153-169

Address for correspondence

Kevin W. Sexton, MD
Department of Biomedical Informatics, College of Medicine, University of Arkansas for Medical Sciences
4301 West Markham, Slot 728, Little Rock, AR 72205
United States   
Email: kev@uams.edu

Publication History

Received: 06 July 2021

Accepted: 06 June 2022

Article published online:
12 October 2022

© 2022. The Author(s). This is an open access article published by Thieme under the terms of the Creative Commons Attribution-NonDerivative-NonCommercial License, permitting copying and reproduction so long as the original work is given appropriate credit. Contents may not be used for commercial purposes, or adapted, remixed, transformed or built upon. (https://creativecommons.org/licenses/by-nc-nd/4.0/)

Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany
