Keywords
burnout - electronic medical record - electronic health record - training - optimization
Background and Significance
The Health Information Technology for Economic and Clinical Health (HITECH) Act of
2009 promoted the adoption and meaningful use of health information technology (HIT).
Despite widespread electronic health record (EHR) adoption in the years that followed,
meaningful use was not fully realized. Instead, accelerated EHR implementations coincided
with increased clinician workloads and a national epidemic of physician burnout.[1] [2] [3] [4] [5] [6] [7] [8] [9] [10] EHR implementation priorities often focus on (1) regulation and compliance, (2)
billing and productivity, and (3) organizational growth and mergers. This incomplete
focus leads to audits and acquisitions instead of clinician engagement,[11] adequate workflow analysis,[12] or development of national usability standards.[13] Loss of autonomy, negative emotions, increased administrative burden, and changes
in workflow and communication were just a few of the unintended adverse consequences
(UAC) of EHR implementation.[14] [15] [16] It was recognized that “technically defined [EHR] implementation success [did] not
ensure maximum physician acceptance and use.”[17] Instead, EHR implementation needed to coexist with or be closely followed by EHR
optimization.[12] [17] [18] [19] [20]
Facilitators of EHR optimization include a clear vision, committed leaders and governance,
involved physician informaticists (PI), nurse informaticists (clinical informaticist
[CI]), dedicated resources, stakeholder engagement, workflow analysis, and ongoing
training.[12] [17] [18] [19] [20] [21] [22] [23] [24] [25] [26] [27] Optimization, variably defined as “maintained attention to the sustained use of
the EHR”[17] or “the process that maximizes the benefits and utility of the EHR system,”[18] relies heavily on training and education. Many of these optimization training efforts
have been described[28] [29] [30] [31]; however, evaluations of more comprehensive, bidirectional optimization programs
are sparse.[12]
Software configuration and “intensive process reengineering”[20] require iteration and considerable organizational support. The clinic-specific Sprint
EHR optimization process allows for real-time problem-solving and tool implementation
unlike the more traditional EHR development process, which can take months and lead
to frustrations when asynchronous communication, IT semantics, and IT organizational
structures do not match clinical needs. Traditionally, HIT software teams are organized
by EHR application (pharmacy, laboratory, mobile, billing) or EHR task (orders, clinical
decision support, letters). We elected to study the Sprint process to show the value
of viewing requests from the perspective of the clinician or staff member, evaluating
clinical efficiency gains to fully comprehend the end goal. We evaluate the multidisciplinary
Sprint team approach, which relies on generalized rather than module-specific governance,
promotes innovative, rapid solutions, and reduces the EHR burden on the end user. We also
aim to reduce “suffering in silence,” the unwillingness of clinicians and staff to report
EHR concerns, which can lead to burnout.
Adhering to key principles of EHR optimization, University of Colorado Health (UCHealth)
developed Sprint, a clinic-centered EHR optimization and training program in 2016.
Sprint overlies an existing framework for ongoing health system–wide EHR optimization
at UCHealth. The Sprint team delivers completed EHR build and intensive training during
brief, onsite interventions in target clinics. We work with ∼40 clinics, 600 clinicians,
and 600 staff members per year and, to date, we have conducted Sprint events in >110
clinics. We previously demonstrated increased clinician satisfaction, improved teamwork,
and decreased clinician burnout with Sprint intervention.[28] In this Sprint program evaluation, we describe and evaluate the EHR request, prioritization,
and software development process that is integral to Sprint and complements our successful
training program.
Objectives
The objective of the study was to describe and demonstrate the work products of one
EHR optimization program that adheres to commonly recognized key principles for successful
EHR optimization.
Methods
Sprint Program Background
The UCHealth Sprint EHR optimization and training team is a high-performing, multidisciplinary
team comprised of a project manager (PM), a CI, a PI, and ambulatory-certified trainers
and EHR analysts who direct and participate in Sprint events. Whereas Sprints are
the most salient component of UCHealth's EHR optimization program, there is also an
existing framework for ongoing optimization related to projects, upgrades, and other
individual requests. UCHealth is a large, integrated health network, comprising 12
hospitals, >600 clinics, and >5,000 clinicians, who practice in a variety of settings:
academic and community, urban and rural, primary care, specialty care, and multispecialty.
All clinics utilize one version of the Epic EHR (version 2020, Epic Systems, Verona,
Wisconsin). The health system implemented Epic in a rolling wave approach beginning
in 2011. Since that time, the organization has at least tripled in size.
Sprint events are 1 to 4 weeks in duration, and timing is determined by the number
of clinicians in the practice (20 clinicians = 1 week, 40 clinicians = 2 weeks,
etc.) although all staff and clinicians are targeted for training and optimization
during Sprint. Approximately every 3 years, clinics have the opportunity to participate
in a Sprint, and clinics are selected based on strong medical director and manager
leadership, clinician and staff desire for Sprint, and timing of last Sprint relative
to current ask. Sprint events are onsite, clinic focused, and facilitated by 1 PM,
1 CI, 1 PI, 4 ambulatory-certified trainers, and 4 ambulatory-certified EHR analysts.
Key components of the Sprint program include group training (kick-off training focused
on EHR personalization and wrap-up training focused on workflow), 1:1 training, and
EHR optimization to address inefficient and problematic clinical workflows. All clinical
staff and clinicians are included with the exception of students and residents who
are typically excluded due to lack of availability to participate.
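The duration rule implied above can be written as a small sketch; this is purely illustrative and assumes the stated pattern continues linearly at roughly 20 clinicians per week, capped at the 1- to 4-week range described in the text:

```python
import math

def sprint_weeks(clinician_count: int) -> int:
    """Approximate Sprint duration: about one week per 20 clinicians,
    bounded by the 1- to 4-week range described in the text."""
    return min(4, max(1, math.ceil(clinician_count / 20)))
```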
The Sprint team employs eight ambulatory-certified Epic (Epic Systems) EHR analysts
who work in teams of four on each Sprint optimization event. Each of the Sprint analysts
also belongs to more specific build groups (letters, in basket, orders) within the
health system ambulatory EHR team. The Sprint analysts investigate, clean up, repair,
and innovate EHR build with direct end-user and clinic-centered feedback. The Sprint
PM collects “break-fix” and “new build” requests from clinicians and staff starting
with pre-Sprint meetings and continuing throughout the Sprint ([Fig. 1]). Break-fix requests seek changes to existing EHR tools that are either
incomplete or not functioning as expected. New build requests include implementation
of EHR vendor foundation tools or design of custom tools to address a clinical need.
Requests can be initiated and reported to the PM by clinic participants or any member
of the Sprint team.
Fig. 1 Clinic engagement and issue reporting begins 90 days prior to Sprint and occurs daily
during the Sprint.
Each request is logged in a clinic-specific Microsoft Excel workbook ([Appendix A]) by the PM or the Sprint analysts and these items are reviewed and updated daily
during Sprints. In each Sprint, there are two daily workbook review sessions. The
first daily workbook review is a 1-hour meeting with the analysts, PM, PI, and nurse
informaticist (CI). A second, more focused daily review with clinic and system leadership
takes place during the daily huddle, a 30-minute meeting typically held during the
lunch hour ([Fig. 1]). Typical attendees include the clinic medical director, clinical content leads,
clinic manager, charge nurse, lead medical assistant, business supervisor, ambulatory
EHR manager/director, and system business/operations representatives.
Appendix A
Sprint standard workbook tracking. Each workbook column is listed with its definition and purpose, followed by its categories.

Subject: The clinical efficiency or the electronic health record (EHR) build team that will address an end-user request; allows sorting of similar requests and helps avoid duplicate entries. Categories: cadence (scheduling), care everywhere (outside records), charging, clean-up, clinical decision support, documentation, Dragon (speech recognition), flow sheets, Haiku/Canto (mobile applications), in basket, inpatient, interface, letters, My Health Connection (patient portal), navigator, orders-imaging, orders-labs, orders-medications, orders-referrals, printing, patient-entered data, reports, secure chat, security/tool access, smart tools, synopsis (disease-specific reports), telehealth.

Priority: Determined primarily by the physician informaticist (PI); priority determines order of work completion. Categories:
High priority:
1. Patient safety concern
2. Good evidence to support loss of clinical revenue directly due to EHR workflow/tools
3. Clinic prioritized these 1–3 items for this Sprint (not applied if they prioritized >3 items as high)
4. Request provides significant benefit to providers/staff in this clinic but would also benefit the larger provider/staff community if this was fixed/improved
Medium priority:
1. Items that will improve “quality” of patient care or are current QI projects within the clinic
2. Items that have a clinic champion and are more likely to be used and maintained as a result
3. Subspecialty workflow important to one subgroup of staff/providers but not entire clinic staff/providers (i.e., epilepsy flow sheet, lupus express lane)
4. Items that improve the efficiency of multiple individuals in the clinic
5. Items that provide some benefit to this clinic and also would benefit other clinics
6. Items that allow the individual or subgroup to use standard tools (synopsis, problem list, ordering, edit/share/co-sign notes, Dragon, smart tools, charging)
Low priority: Items that do not fit into high or medium priority

Requestor: Clinician, staff, or operational leader who placed the request; promotes understanding of the request and the ability to close the loop with action taken. Categories: N/A

Request details: End-user request in their own words. Categories: N/A

Daily updates: Dated entries updating the Sprint team on when investigation, discussion, and action have been taken on a request. Categories: N/A

Sprint owner: The Sprint team member who is taking the lead on an item. Categories: N/A

Status: Indicates where the request is in our queue. A temporary status is applied until a final status can be selected; all workbook items have a final status at the end of the Sprint event. Categories:
Final statuses:
Clinic-owned: workflow that needs further evaluation or education, or the clinic does not prioritize it during Sprint
Done: request was completed during Sprint
Not doing: request was not completed during the Sprint and a ticket was not placed to have this item completed
Temporary statuses:
CI/PI owned: a PI or nurse informaticist will see the request through to completion; an analyst is not needed
Parking lot: request has not yet been prioritized or assigned to an owner
Ambulatory prioritization and optimization meeting (PROM): request will impact additional stakeholders beyond this clinic/specialty and will be brought to our weekly ambulatory EHR governance meeting for decision
To do - researching: request is being investigated

Necessary discussion: Discussion outside of the Sprint team is needed. Categories: daily Sprint huddle, PROM

Handoff team: Request needs to be completed by another team with a different skill set or governance structure; training team items are moved to a separate workbook tab and addressed by the Sprint trainers. Categories: integrated orders, beaker (laboratory), cadence (scheduling), interface, MHC (patient portal), security, training, virtual health, willow (pharmacy)

Analyst tracking task#: An internal system used to track time spent by Sprint analysts on workbook requests. Categories: N/A

Ticket#: Number assigned to track build requests in or out of Sprints. Categories: N/A

Build buddy: The analyst who reviews the build of another analyst before validating the build with an end user or PI/clinical informaticist (CI). Categories: N/A

New build doc: Reminder to analysts to add new build to the build document that is left with the clinic after Sprint. Categories: Yes, No

Workflow doc: Reminder to analysts to add new workflow information to the workflow document that is left with the clinic after Sprint. Categories: Yes, No

Tip sheet: Reminder that a tip sheet needs to be created for a workflow or tool. Categories: Yes, No

Training Wiki: Reminder to the training team to add an important new tool or workflow to our 1:1 training for this clinic. Categories: Yes, No

Wrap up doc: Reminder to the PI/CI that this important tool or workflow needs to be demonstrated during the wrap-up group session. Categories: Yes, No
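In practice the workbook is a clinic-specific Microsoft Excel file. Purely as an illustrative sketch, and not part of the Sprint program itself, the Appendix A columns could be modeled in code roughly as follows (field, enum, and method names are our own):

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Optional

class Priority(Enum):
    HIGH = "High"      # e.g., patient safety, revenue loss, clinic's top 1-3 asks
    MEDIUM = "Medium"  # e.g., QI projects, subspecialty workflows, standard-tool adoption
    LOW = "Low"        # everything that does not fit high or medium

class Status(Enum):
    # Temporary statuses used while a request is being worked
    CI_PI_OWNED = "CI/PI owned"
    PARKING_LOT = "Parking lot"
    PROM = "Ambulatory prioritization and optimization meeting (PROM)"
    TO_DO_RESEARCHING = "To do - researching"
    # Final statuses; every item carries one of these by the end of the Sprint
    CLINIC_OWNED = "Clinic-owned"
    DONE = "Done"
    NOT_DOING = "Not doing"

FINAL_STATUSES = {Status.CLINIC_OWNED, Status.DONE, Status.NOT_DOING}

@dataclass
class WorkbookItem:
    subject: str                      # e.g., "in basket", "orders-medications"
    requestor: str                    # clinician, staff, or operational leader
    request_details: str              # end-user request in their own words
    priority: Priority = Priority.LOW
    status: Status = Status.PARKING_LOT
    sprint_owner: Optional[str] = None
    handoff_team: Optional[str] = None     # e.g., "security", "willow (pharmacy)"
    analyst_tracking_task: Optional[str] = None
    ticket: Optional[str] = None
    build_buddy: Optional[str] = None
    daily_updates: list[str] = field(default_factory=list)  # dated progress notes
    # Reminder flags (Yes/No columns in the workbook)
    new_build_doc: bool = False
    workflow_doc: bool = False
    tip_sheet: bool = False
    training_wiki: bool = False
    wrap_up_doc: bool = False

    def is_closed(self) -> bool:
        """True once a final status has been assigned."""
        return self.status in FINAL_STATUSES
```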
Workbook items are prioritized through the lens of the clinician and staff, the clinic
as a whole, and the larger system because all stakeholders are involved. Priority
is elevated when the request is high priority to clinical leaders, concerns patient
safety, affects multiple clinic participants or high-volume workflows, and/or positively
impacts end users outside of the target clinic. Lower-priority requests include a
time-consuming build that does not meet the above criteria and/or impacts the
workflows of only one or a small number of users. If there are too many requests to
accomplish during the allotted Sprint weeks, then the PI works with clinic leaders
to determine what can be accomplished during the Sprint and what requests will need
to be entered as general requests to non-Sprint EHR teams.
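As a rough, illustrative sketch only, the prioritization heuristics described above and in Appendix A might be approximated as follows; the flag names are ours, and the real decision is a joint judgment by the PI and clinic leaders rather than a mechanical rule:

```python
def assign_priority(*, patient_safety: bool = False, revenue_loss: bool = False,
                    clinic_top_three_ask: bool = False,
                    major_benefit_beyond_clinic: bool = False,
                    qi_project: bool = False, has_clinic_champion: bool = False,
                    subspecialty_workflow: bool = False,
                    helps_multiple_users: bool = False,
                    some_benefit_beyond_clinic: bool = False,
                    enables_standard_tools: bool = False) -> str:
    """Rough approximation of the Appendix A priority rules."""
    # High: safety, revenue loss, one of the clinic's top three asks, or a fix
    # that clearly helps providers/staff well beyond the target clinic.
    if (patient_safety or revenue_loss or clinic_top_three_ask
            or major_benefit_beyond_clinic):
        return "High"
    # Medium: quality/QI items, championed items, subspecialty workflows, items
    # helping multiple users or other clinics, or standard-tool adoption.
    if (qi_project or has_clinic_champion or subspecialty_workflow
            or helps_multiple_users or some_benefit_beyond_clinic
            or enables_standard_tools):
        return "Medium"
    return "Low"
```

For example, assign_priority(patient_safety=True) would return "High," whereas a request benefiting a single user with none of these attributes would default to "Low."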
Using Agile project management principles,[32]
[33] EHR build is accomplished by EHR analysts who meet either directly with the requestor
or with the PI/CI who has met with the requestor. The build process is iterative throughout
the Sprint with clinicians and/or staff reviewing the EHR build and providing real-time
feedback. Clinicians have access to the onsite Sprint team to meet face-to-face, but
a significant amount of feedback is also received and updated via e-mail during the
Sprint. Notably, the CI/PI will help gain support and approval from specialty service
lines and EHR governance committees when there are invested stakeholders beyond the
participating Sprint clinic.
The build product is considered complete at the close of the Sprint event and further
iterations must go through the larger system optimization processes. At the end of
the Sprint, the clinic workbook remains with clinic leaders and contains workflow
documents explaining new build and critical information about outstanding items and
how to follow up (ticket number, responsible team).
Sprint Program Evaluation
For this Sprint program evaluation, four physician informaticists (PIs) and one family
medicine resident participated in the retrospective review of 20 Sprint workbooks
from UCHealth Sprint events conducted between May 2019 and January 2020. Each PI regularly
leads Sprints, actively participates in EHR governance, and has at least 4 years of
informatics experience. The family medicine resident reviewer helped design the study
and reviewed four workbooks in conjunction with a lead PI. Objective information such
as Sprint location, timing, participants, and clinic specialty was collected from
Sprint workbooks. Workbook item final status was also noted. To further describe the
types of clinical requests, two independent physician reviewers categorized the workbook
requests by (1) EHR team with primary responsibility for the request, (2) clinical
efficiency gained by addressing the request, and (3) type of EHR intervention needed
([Fig. 2]).
Fig. 2 Categorization of Sprint workbook items by physician reviewers.
The workbook notes indicated which EHR team was ultimately responsible for request
resolution.
Determining clinical efficiency gains required evaluating each request for what clinicians
or staff attained from the completed request. EHR intervention included break-fix
and new build as defined. “Clean-up” items are those directly solicited from clinics
and include refining existing build (i.e., removing departed providers from clinic
schedule view) to improve accuracy. EHR vendor enhancement requests are items that
cannot be built or changed locally without vendor intervention. Finally, workflow
solutions translate to training end users on existing EHR tools to enhance efficiency.
Each workbook item was annotated for each of these three measures by the first reviewer
and this annotation was unblinded to the second reviewer. The final categorization
of each item was determined through discussion by the two reviewers if there was disagreement.
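A minimal sketch of this dual-review reconciliation, with measure names of our own choosing, could look like the following; in the study, disagreements were resolved by discussion between the reviewers rather than in code:

```python
from typing import Optional

MEASURES = ("ehr_team", "clinical_efficiency", "intervention_type")

def adjudicate(first_review: dict[str, str],
               second_review: dict[str, str]) -> dict[str, Optional[str]]:
    """Combine two reviewers' labels for one workbook item. Agreements stand;
    disagreements are left unresolved here because, in the study, they were
    settled by discussion between the two physician reviewers."""
    final: dict[str, Optional[str]] = {}
    for measure in MEASURES:
        if first_review[measure] == second_review[measure]:
            final[measure] = first_review[measure]
        else:
            final[measure] = None  # placeholder; resolved by reviewer discussion
    return final
```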
Study clinics were classified as academic if they were staffed by School of Medicine
faculty and as community if their clinicians were employed through our affiliate community
practice group. A multitude of specialties were represented in this study and included
the following: primary care, gynecologic oncology, rheumatology, OBGYN, preprocedural
(preoperative) services, physical medicine and rehabilitation, neurosurgery, neurology,
interventional pain management, podiatry, orthopaedics, psychiatry, pediatrics, endocrinology,
infectious disease, urogynecology, and allergy. The primary care study group included
internal medicine, family medicine, and urgent care. The specialty group included
single same-specialty medicine clinics. The surgical group included single same-specialty
surgery clinics. Multispecialty groups included multiple different specialty and/or
surgical clinicians who work together at one practice site.
Results
On average, the Sprint team serviced 30 participants, clinicians (n = 13) and staff (n = 17), per week of Sprint. Twenty Sprint workbooks, including those from 9 academic
clinics and 11 community clinics, were reviewed. A total of 1,254 requests were received
from 407 clinicians and 538 staff over 31 weeks of Sprint. [Table 1] outlines participant demographics and Sprint workload.
Table 1
Participant demographics and corresponding workbook (WB) requests

| Specialty | Academic clinics | Academic clinicians | Academic staff | Academic weeks | Academic WB requests | Community clinics | Community clinicians | Community staff | Community weeks | Community WB requests | WB requests by specialty (% total requests) |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Primary care | | | | | | 3 | 63 | 129 | 5 | 132 | 132 (11%) |
| Urgent care | | | | | | 1 | 24 | 45 | 2 | 51 | 51 (4%) |
| OBGYN | | | | | | 1 | 11 | 38 | 1 | 42 | 42 (3%) |
| Subspecialty | 5 | 134 | 51 | 7 | 364 | 1 | 18 | 26 | 2 | 76 | 440 (35%) |
| Surgery | 4 | 67 | 66 | 7 | 280 | | | | | | 280 (22%) |
| Multispecialty | | | | | | 5 | 90 | 183 | 7 | 309 | 309 (25%) |
| Totals | 9 | 201 | 117 | 14 | 644 | 11 | 206 | 421 | 17 | 610 | 1,254 |
Abbreviation: OBGYN, obstetrics and gynecology.
Primary care specialties (primary care and urgent care) requested 26 items per week
and nonprimary care specialties (surgery, medicine subspecialties, multispecialty
groups, and OBGYN) requested 46 items per week. Overall, 2.1 requests per primary
care provider and 3.4 requests per specialist provider were logged during Sprint.
Sixty-nine percent (872/1,254) of all clinic requests were completed during Sprint
([Fig. 3]). Issues were considered complete when end users indicated satisfaction with the
provided solution. Nineteen percent (236/1,254) of requests required referral to a
larger health system governance or other nonambulatory IT teams. These requests were
sufficiently vetted and content was fully prepared for build, so the majority of requests
were completed within 1 month following Sprint. The remaining 12% (145/1,254) of items
were either addressed or redacted by the clinic (52/1,254), determined after joint
discussion to not prioritize or complete (35/1,254), or could not be done due to high
degree of customization required to complete (58/1,254).
Fig. 3 Outcomes of Sprint workbook requests.
Of the 1,254 total Sprint requests, 46% (571/1,254) simply required modification of
existing EHR tools (break-fix and clean-up), 25% (309/1,254) required net new build,
and 2% (28/1,254) required that the Sprint team ask the EHR vendor to work on solutions.
Twenty-seven percent (342/1,254) of requests required investigation by members of the
Sprint team, but the request was satisfied by training on existing EHR tools ([Fig. 4]).
Fig. 4 Type of Sprint intervention required to address Sprint workbook requests.
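The distribution in [Fig. 4] can be reproduced from the per-item intervention labels. A minimal, purely illustrative tally sketch (function and label names are ours) is shown below:

```python
from collections import Counter

def intervention_mix(labels: list[str]) -> dict[str, str]:
    """Tally intervention types across workbook items and express each
    category as a share of the total, mirroring Fig. 4."""
    counts = Counter(labels)
    total = sum(counts.values())
    return {label: f"{n} ({n / total:.0%})" for label, n in counts.most_common()}

# With the counts reported above (571 modifications of existing tools,
# 342 workflow/training solutions, 309 new builds, 28 vendor requests),
# the shares round to 46%, 27%, 25%, and 2%, matching the text.
```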
To demonstrate the concept of clinical efficiency gains, [Table 2] outlines representative workbook requests that were satisfied by ambulatory EHR
analysts and/or the Sprint training team. A few examples of tool access requests managed
by the security team include the following: “I am a clinic manager and I need access
to flow sheets,” “our advanced practice providers (APPs) are unable to log into the
clinical guideline tool through the EHR,” and “our social workers cannot access synopsis
reports.”
Table 2
Representative examples of most common Sprint workbook requests and categorization

Charging
- Ambulatory team: “Our charge list is too long; can it be shortened to only include what we do in our clinic?”
- Training team: “How do you create favorites for charges?”

Clinical review
- Ambulatory team: “We need our own synopsis report for interventional pain, because the general pain synopsis doesn't include our procedures”; “This synopsis report is too long and requires too much scrolling! Can synopsis report sections appear collapsed by default?”
- Training team: “Is there a way for me to more easily find labs relating to my specialty?”; “Too many scanned results live in the media tab and we have to hunt and peck to find them; they don't align with other testing sections (radiology, labs, etc.)”; “How can I rearrange the problem list so I see my specialty-related problems on top?”

Documentation
- Ambulatory team: “We need custom note templates for new and return spine patients”; “Menstrual history section looks different for providers and staff and the information in each does not communicate with the other”; “The links you built for BMI and LMP don't work, even though that discrete data are available in the chart”; “Can we add the Hoffman exam to our physical exam template?”
- Training team: “We would like the MA to populate the provider note with patient-entered HPI/ROS/annotated images; do they need special security to do so?”; “Why don't we all get those special Dragon (speech recognition) microphones?”; “Add SCORAD (allergy) flow sheet to the EHR”; “Add link to bring Asthma Control Test information into the provider note”

In basket
- Ambulatory team: “Why can't my patients find me to send a message through the EHR?”; “Unable to close two open encounters, erroneous smartset (existing workflow for this) did not work”; “My in basket messages are going to the wrong pool (group)! I work in 3 different clinics and need the patient messages to route to the clinic where I saw that patient”; “Dr. T reports office notes are not auto-routing to referrers”
- Training team: “How can we change from paper surgical case requests and send them to our schedulers in a way that we can find those messages later?”; “What is the fastest way to tell when a patient has the next appointment in my clinic from in basket?”; “Providers would like to have ‘reply’ and ‘reply all’ arrows on every in basket message type”

Mobile
- Ambulatory team: “We need to add low back pain questions to the existing neck pain questionnaire for new patients”; “Dr. W requested that we add insurance status to summary report in EHR mobile application”
- Training team: “When Dr. Y uses Haiku, he gets an error message; does license need reactivation?”

Ordering
- Ambulatory team: “Why do I need to look in the (scanned) media tab for outside labs and can't see them in the labs section of chart review?”; “All orders with CT INJ should have a synonym of guided”; “Provider X signature is not showing when printing or reprinting orders”
- Training team: “We need an orders preference list that is designed for our specialty”; “Patch testing workflow is difficult; can we improve ease of ordering?”; “Provider X notes that he gets a ‘pop up’ that he cannot get past when trying to order pain injections”

Research
- Ambulatory team: (none)
- Training team: “Can we create a referrals report?”

Schedule
- Ambulatory team: “Nurses X and Y need to be added as schedulable resources for AMC pain and spine clinic”
- Training team: Newer staff need assistance with scheduling at different clinic locations
Clinical efficiencies most commonly gained with workbook requests included documentation
(28% [350/1,254]), ordering (20% [255/1,254]), in basket (17% [217/1,254]), and clinical
(chart) review (15% [188/1,254]) efficiency ([Table 3]). Typically resolved by security or HIT hardware teams, updated access to existing
EHR tools represented 8% (103/1,254) of the clinical efficiency gains. The EHR ambulatory
team worked the majority of Sprint requests (47% [590/1,254]), with training (15%
[192/1,254]) and security teams (14% [175/1,254]) being the next largest contributors
([Table 3]). Requests evaluated by ambulatory, hardware, security, and training teams comprised
80% (1,002/1,254) of reported items and these are typically the items completed during
or directly after Sprint.
Table 3
Number of total workbook requests categorized by clinical efficiency gained and by the electronic health record (EHR) application teams assigned

| EHR application team | Documentation | Ordering | In basket | Clinical (chart) review | Tool access | Schedule | Charging (billing) | Research | Mobile device | Total requests assigned to each team |
|---|---|---|---|---|---|---|---|---|---|---|
| Ambulatory team | 247 | 93 | 102 | 127 | 14 | 4 | 1 | 1 | 1 | 590 |
| Clinical decision support team | | 1 | | 3 | | | | 1 | | 5 |
| Hardware team | 9 | 18 | 1 | 1 | 14 | 2 | | | | 45 |
| Health information team | | | | 1 | | | | | | 1 |
| Inpatient team | 3 | 1 | | | 1 | 1 | | | | 6 |
| Interface team | | 2 | 1 | 1 | 1 | 1 | | | | 6 |
| Laboratory team | 1 | 13 | 1 | 16 | | | | | | 31 |
| Patient portal team | 10 | 2 | 30 | 2 | | | | | | 44 |
| Mobile device team | 1 | | 1 | 2 | 3 | 1 | | | | 8 |
| Pharmacy team | | 19 | 1 | 2 | | | | | | 22 |
| Radiology team | 2 | 12 | | | 2 | 3 | | | | 19 |
| Reporting team | | | | 2 | | 1 | | 10 | | 13 |
| Revenue team | | 1 | | 1 | 1 | 2 | 27 | | | 32 |
| Scheduling team | 1 | 4 | | 4 | 2 | 47 | 4 | 3 | | 65 |
| Security team | 9 | 24 | 67 | 7 | 61 | 3 | 3 | 1 | | 175 |
| Training team | 67 | 65 | 13 | 19 | 4 | 16 | 2 | 3 | 3 | 192 |
| Total requests by clinical efficiency (% total) | 350 (28%) | 255 (20%) | 217 (17%) | 188 (15%) | 103 (8%) | 81 (6%) | 37 (3%) | 19 (2%) | 4 (0.03%) | 1,254 |
The specialty groups requested a higher percentage of items that earned them clinical
review (16 vs. 10%) and documentation (29 vs. 23%) efficiencies compared with their
primary care colleagues who requested slightly more order modifications (22 vs. 20%;
[Fig. 5]).
Fig. 5 Percent of total workbook requests primary versus specialty care.
Discussion
When EHR optimization is anchored on business priorities, EHR upgrades, and help desk
tickets, the lens of practicing clinicians and staff is overlooked. As a result, optimization
of EHR software is suggested by an EHR vendor or a few outspoken individuals and it
is followed by usability concerns and increased clinician burnout.[17] Engaging stakeholders, gaining consensus, and analyzing workflows are labor intensive
compared with the build and distribution of a vendor-developed EHR upgrade product.
With our novel Sprint clinic-centered approach, we authenticate EHR optimization.
We adhere to recognized optimization principles[12] [17] [18] [19] [20] [21] [22] [23] [24] [25] [26] [27] and move beyond training alone using Agile project management to analyze workflows,
configure software, and decrease EHR burden.[28] In Sprint, we create a burning platform, lead with compassion, facilitate and support
change, and set high expectations for stakeholder engagement. By focusing EHR optimization
on usability and clinical efficiency gains, the impact locally and at the system level
is measurable and informs HIT processes and priorities.
Sprint clinic-specific optimization relies on direct face-to-face interaction with
a high number of clinicians and staff. Operational leaders, medical directors, content
experts, and super-users build consensus around clinic requests for workflow optimization
and software configuration. These requests are tracked by the Sprint PM and vetted and
prioritized by the clinic with direct guidance from physician and nurse informaticists
and ambulatory EHR analysts. The majority of requests (88%) are completed during or
directly after the Sprint, so trainers and informaticists can provide at-the-elbow
education during the Sprint on all new or updated tools. Unique to clinic-centered
optimization is accountability, which is important when issues are incomplete, inaccurate,
or deserve special recognition. The Sprint team coexists in the clinic, so discussion,
apologies, and gratitude are directly expressed. This process improves the IT–clinic
relationship by creating empathy and understanding between EHR analysts, staff, and
clinicians. One medical director noted:
“The Sprint was absolutely amazing! So helpful in every way possible. It helped all
of our disciplines (doctors, advanced practice providers, social workers, chaplain,
nurse, schedulers, etc.). We received extremely practical and doable tips, concepts,
workflows, etc., every day. It gave our team members hope that the system cared about
them, wanted things to be easier, and less burdensome. [We] gain[ed] clinically relevant
knowledge, and made things so much better than where we started.”
The “squeaky wheel” effect that occurs with traditional help desk processes
is less apparent, as recommendations and decisions are made and workflows converge
at the clinic level.
Conceptually, our “squeaky wheels” are advantageous because they are Sprint informaticists
and analysts who participate both in Sprints and in system governance. During Sprint,
these individuals share new ideas and build with colleagues. After Sprint, they bring
recommendations to system governance for role- or application-specific innovations
that provide benefit to the larger health system. For example, documentation-focused
requests are often unique to specialty and role; thus, they are typically created
for specific users. Alternatively, in basket requests are often application specific
and tool access requests are often role specific. Thus, a solution that provides innovative
in basket functionality is slightly delayed in system governance after Sprint, but
then it is implemented for everyone in the health system who uses that EHR application.
User requests indicating lack of access to an existing EHR tool are similar in that
the Sprint team typically finds that all users in that same role are also lacking
the requested tool. Order-based requests can benefit one clinic (e.g., build a new
department order preference list) or many clinics (e.g., add creatinine clearance to
the chest computed tomography [CT] with contrast order). In each of these examples,
our Sprint team members, our desired “squeaky wheels,” proactively scale usability
and innovation that augments our traditional, more reactive approach to help desk–based
EHR optimization.
For this evaluation, we categorized Sprint requests as clean-up, break-fix, workflow
investigation, or new build. Given that 15% of requests required software clean-up
only, it is possible that health care organizations could reduce EHR burden by consistently
prioritizing EHR clean-up on a regular schedule, creating a push intervention rather
than a pull. Similarly, adding clinical knowledge to our user provisioning team (i.e.,
the team that determines which EHR template is seen by which role) and ambulatory EHR teams could
help prevent 8% of the clean-up requests that involved lack of access to an existing
EHR tool. We also determined that measuring net new build alone significantly underestimates
the contributions of the software analyst to overall EHR optimization efforts. Of
the total 1,254 Sprint requests, only 25% of requests required novel EHR build, but
an additional 73% required technical investigation and/or solutions. Our evaluation
suggests that workflow investigation and software clean-up are more time intensive
than net new build. Fortunately, with Sprint, clinical end users are present and engaged
and Sprint leaders can assist analysts with some of this work.
Sprint physician informaticists actively practice in outpatient clinics. Thus, the
Sprint team leaders understand that “there are special challenges with the ambulatory
setting”[17] and that “the same [EHR] application has to support different users who work in
different contexts to accomplish distinct goals.”[34] The Sprint team supports these unique goals while also capitalizing on the clinical
expertise and workflows of a variety of clinical groups. For example, specialty clinics
often champion the creation of disease-specific summary reports during Sprint. Since
primary care clinicians treat or co-manage similar diseases, they are introduced to
these reports during their Sprints. A similar impact can occur between specialties.
For example, gastroenterologists may not prioritize a metabolic bone disease report,
but if the report was introduced to them, they would use it to follow bone health
in their at-risk patients with inflammatory bowel disease. Our data support this observation
that primary care benefits from specialty build in that primary care groups request
a lower total number of items during Sprint. Specialty clinics also request more custom
note templates (documentation efficiencies) given their inherently limited scope compared
with primary care. In contrast, the Sprint team capitalizes on the cohesive, team-based
workflows of primary care and spreads these core ideas in specialty Sprints.
Optimization is variably defined in the literature and by health care organizations.
Therefore, the process and impacts of optimization require continued study.[16] In our published research on Sprint training, we showed increased user satisfaction,
improved teamwork, and decreased EHR burden after Sprints.[28] In this program evaluation, we describe how we implement the key principles of EHR
optimization to innovate bidirectionally to achieve optimization that positively affects
individuals, clinics, and the organization as a whole. We overcome the IT productivity
paradox (i.e., simply increasing investments, without clear guiding principles and
end points, which can lead to worsened productivity) with a commitment to and investment
in EHR training, EHR build, and process redesign.[20] Sprint user-focused innovations and designs regularly feed into a health system
governance process designed to benefit the larger clinician and staff community.
Limitations
It is difficult to compare the yield of our prior, more traditional, institution-wide,
“help desk”–driven EHR optimization process to the yield of our clinic-specific Sprint
EHR optimization process for several reasons. Our institution has grown exponentially
over the past 10 years and, therefore, it is impossible to produce a static yield
for comparison. A status of “done” is also defined differently by our help desk (i.e.,
request completed or user did not respond to outreach) than by the Sprint team (i.e.,
end user signs off on every solution). In addition, we track help desk tickets and
other requests by completion status only and clinical efficiency gains are not tallied.
Break-fix and new build are almost universally prioritized before EHR “clean-up” in
the traditional optimization model and thus a status of “clean-up” is not measured
for comparison. Finally, help desk calls can often be solved with training and most
of the Sprint training effort is not tracked within our workbooks.
Training is integral to Sprints and frequently issues are solved with training before
they can be added to the Sprint workbook. We have not made an attempt to capture every
training solution because collecting this information would sacrifice time spent assisting
end users. Thus, the Sprint workbook requests that were solved with training are certainly
an underestimate of total training effort. Instead, they represent issues that our
trainers or informaticists were not able to solve immediately without dedicated time
to investigate. In these instances, the investigation process and solution are tracked
to completion in the Sprint workbook.
Generalizability to smaller institutions without adequate funding to support a Sprint
team is difficult. We believe the strength, experience, and camaraderie of our team
members are more important than the total number of individuals, but our large team
does allow us to create a burning platform and move swiftly yet meaningfully through
clinical areas. It is also difficult to generalize about particular specialties when
we did not include a large number of clinics representing a single specialty.
Conclusion
This program evaluation demonstrates an EHR optimization process that successfully
adheres to ideal optimization principles for health care organizations.
Clinical Relevance Statement
Key principles of EHR optimization are well described in the literature, but descriptions
of EHR optimization programs and processes are sparse. This evaluation highlights
the work products and experience of one comprehensive and long-standing program at
a large, integrated health network.
Multiple Choice Questions
1. What do ideal EHR implementation, upgrade implementation, and optimization processes have in common?
a. They require stakeholder involvement.
b. They require training and education.
c. They require clinical expertise.
d. They require a focus on usability.
e. All of the above.
Correct Answer: The correct answer is option e. EHR implementation, upgrades, and optimization exist
as a continuum. Health care organizations are forever growing, merging, and changing,
but the key principles behind a successful interface between the EHR team and health
care team over time are the same. The most usable, efficient, and successful solutions
occur when we adhere to these principles, in addition to ensuring committed leaders,
dedicated resources, a clear vision, and a strong clinical informatics team.
2. How do Agile project management principles help promote improved EHR optimization processes?
a. Agile takes away the need for a PM.
b. Agile promotes clean workbook request tracking.
c. Agile supports product iteration during software build.
d. Agile provides team-building strategies for multidisciplinary teams.
Correct Answer: The correct answer is option c. Agile project management is key to EHR optimization
because Agile promotes direct interface with the issue requestor, respecting and incorporating
their feedback into the final EHR build product or workflow solution. In addition,
unlike traditional “helpdesk” optimization processes, the product or solution is not
complete until a two-way conversation has taken place between the requestor and the person
satisfying the request. Oftentimes, helpdesk tickets are closed by an IT staff member
working remotely, and the solution is not sufficient or agreed upon by the end user.