Keywords clinical decision support systems - usability testing - human-centered design - clinical workflows
Background and Significance
Inflammatory bowel disease (IBD) affects up to 300,000 children in the United States[1] and is characterized by gut inflammation leading to bloody diarrhea, weight loss, and multisystemic complications.[2] The most common IBD complication is iron deficiency anemia (IDA),[3] which worsens quality of life[4] and developmental outcomes in children.[5] Guideline-based IBD IDA care involves annual screening for anemia and treatment with iron when deficiency is present, but adherence to guidelines is often poor.[6][7]
Clinical decision support systems (CDSSs) integrated into electronic health records (EHRs) provide an opportunity to deliver evidence-based recommendations that are well integrated into systems of care.[8][9][10][11] Human-centered design (HCD) has been applied to the development of CDSS[12][13][14][15] to improve integration of CDSS tools into clinical workflows and to improve usability, with the goals of improving patient safety, enhancing clinical outcomes, and improving process efficiency, among other aims.[12][13][16][17] HCD involves in-depth analysis of work systems and care processes[12][16][18] to inform design of a CDSS prototype, which undergoes iterative redesign using human factors (HF) principles,[12][19][20] taking into account the five "rights" of clinical decision support (CDS).[21] Key gaps in the research on HCD of CDSS include methods of translating requirements into build and demonstration of how HCD contributes to design.[14][22]
Objective
The objective was to use HCD methods to evaluate baseline care practices and inform the iterative design of a CDSS integrated into clinical workflows and designed to improve IBD IDA care: the IBD Anemia Diagnosis Tool (IADx).
Methods
Organizational Setting
This study was conducted in the Pediatric Gastroenterology Division at Johns Hopkins
University School of Medicine.[23 ] At the time of the study, the division had 12 full-time faculty, 6 fellows, and
1 nurse practitioner who provided care to 500 children with IBD. All research was
conducted with approval of the institutional review board. An interdisciplinary team
was assembled including IBD experts (M.O.-H., S.H.), EHR software developers (S.D.M.
and A.M.), health informaticians (S.D.M., H.P.L., Z.M., J.H.G., and P.N.), and an
HF engineer (A.P.G.). [Table 1 ] summarizes the methodology used to carry out the HCD and formative evaluation of
IADx.
Table 1
Summary of data collection methods and aims

| Data collection method | Time completed | Participant type | Research aims |
|---|---|---|---|
| Semi-structured interview | Prior to CDSS development | Clinician champion | Create process map, identify goals of CDSS |
| Concurrent "Think Aloud" while completing experimental task | Day of simulation | Clinician end-users | Formative usability evaluation |
| Direct observation of task difficulty | Day of simulation | Clinician end-users | Formative usability evaluation |
| SUS survey | Day of simulation | Clinician end-users | Formative usability evaluation |
| Semi-structured interview | Day of simulation | Clinician end-users | Formative usability evaluation |

Abbreviations: CDSS, clinical decision support system; SUS, System Usability Scale.[34]
Process Mapping
To determine the approach to anemia screening by clinicians, semi-structured interviews were performed with a junior and a senior IBD clinician (S.D.M. and M.O.-H.) in the Pediatric Gastroenterology Division using an interview guide (see [Supplementary Material A], available in the online version). The interviewer used responses to draft a process map using draw.io,[24] which was validated in a follow-up interview.
IADx Prototype Design
A 1-hour virtual design meeting was held[25] with health informaticians and clinicians (S.D.M., H.P.L., and A.M.) in which the process map was used to identify tasks that could be supported by CDSS. The group selected an intervention involving an interruptive alert with a linked order set. A prototype was designed on paper and shown to senior IBD researchers (M.O.-H. and S.H.) for feedback. Over a series of eight additional 30-minute design sessions, S.D.M. and A.M. built a functioning EHR-based CDSS prototype in the EpicCare Ambulatory EHR.[26] Conditional rules were created to identify patients with IBD. Evidence-based laboratory thresholds[27] were used to determine whether patients had IDA, and Health Maintenance Plans in Epic were linked to laboratory monitoring tests. When the patient was due for at least one laboratory test, or had IDA within the prior year and no iron prescription, a best practice advisory (BPA) with a linked order set containing preselected laboratories and medications was presented to the clinician.
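The triggering logic described above can be sketched as a simple rule check. The field names and numeric thresholds below are illustrative assumptions for the sketch, not the exact conditional rules or age- and sex-specific cutoffs used in the Epic build:

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

@dataclass
class PatientRecord:
    has_ibd: bool
    hemoglobin_g_dl: Optional[float]  # most recent value; None if never measured
    ferritin_ng_ml: Optional[float]
    last_screen: Optional[date]       # None if never screened
    on_iron: bool                     # active iron prescription

# Illustrative placeholder thresholds; the real build used the
# evidence-based, age- and sex-specific cutoffs of reference [27].
HGB_CUTOFF = 11.5
FERRITIN_CUTOFF = 30.0
SCREEN_INTERVAL = timedelta(days=365)  # annual screening

def has_ida(p: PatientRecord) -> bool:
    """Iron deficiency anemia: low hemoglobin plus low iron stores."""
    return (p.hemoglobin_g_dl is not None and p.hemoglobin_g_dl < HGB_CUTOFF
            and p.ferritin_ng_ml is not None and p.ferritin_ng_ml < FERRITIN_CUTOFF)

def should_fire_bpa(p: PatientRecord, today: date) -> bool:
    """Fire the advisory when screening is overdue, or IDA is present and untreated."""
    if not p.has_ibd:
        return False
    overdue = p.last_screen is None or today - p.last_screen > SCREEN_INTERVAL
    untreated_ida = has_ida(p) and not p.on_iron
    return overdue or untreated_ida
```

In this sketch, a patient with IBD, low hemoglobin and ferritin, and no iron prescription would trigger the advisory even if recently screened, matching the "had IDA within the prior year and no iron prescription" arm of the rule.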
Simulation-Based Usability Evaluation of the IADx Prototype
Simulation-based evaluation was performed to iteratively refine the IADx prototype with clinician end-users, using a "Think Aloud" technique in which users verbalized their thoughts while using IADx, a short postexperimental survey, and a semi-structured interview (see [Table 1]; [Supplementary Materials B, C, and D], available in the online version).[28] The feedback was recorded, transcribed, and analyzed for common themes, which were incorporated into the redesign.[8][29]
Participants
Participants were recruited via email from active providers with purposive sampling
to include a variety of roles and experience levels. No compensation was offered.
Clinical Scenarios and Tasks
IADx was tested in Epic TST at the Johns Hopkins University Simulation Center using a "computer on wheels" workstation. A member of the study team (Z.M.) sat next to the participant to guide the session, conduct observations using an observation guide, perform semi-structured interviews, and record responses (see [Supplementary Materials B, C, and D], available in the online version).
Each end-user tested two simulated in-person office visits and two asynchronous In Basket encounters to review laboratory results. Simulated patient records varied by (1) status of anemia screening (screened vs. unscreened), (2) presence of IDA (IDA vs. no IDA), (3) presence of iron treatment (treated IDA vs. untreated IDA), and (4) interruptive vs. noninterruptive alert. Order of encounters was randomized using a modified Latin square method.[30]
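For illustration, a basic (unmodified) Latin square over the four encounter types can be generated by cyclic shifts, so that every encounter type appears exactly once in each position across participants; the modified method of reference [30] may differ in its details:

```python
def latin_square(items):
    """Each row is a cyclic shift of the previous one, so every item
    appears exactly once per row and once per column."""
    n = len(items)
    return [[items[(row + col) % n] for col in range(n)] for row in range(n)]

# Hypothetical encounter labels for the four simulated scenarios.
encounters = ["office/interruptive", "office/noninterruptive",
              "inbasket/interruptive", "inbasket/noninterruptive"]
orders = latin_square(encounters)
# Participant i would receive the encounter order orders[i % 4].
```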
Measures and Data Collection
Audio and video of the participant and computer screen were recorded using B-Line Sim Capture,[31] and other data were collected on REDCap.[32][33] After completion of all four scenarios by each participant, the System Usability Scale (SUS)[34] was administered and semi-structured interviews were performed (see [Supplementary Material D], available in the online version).
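The SUS produces a 0 to 100 score from ten 5-point Likert items using Brooke's standard scoring rule (reverse-scoring the even-numbered items, then scaling the 0-40 sum by 2.5); a minimal sketch:

```python
def sus_score(responses):
    """Standard System Usability Scale scoring (Brooke, 1996).
    `responses` is a list of ten Likert ratings from 1 to 5,
    in questionnaire order."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses rated 1-5")
    # Odd-numbered items (index 0, 2, ...) are positively worded: score r - 1.
    # Even-numbered items are negatively worded: score 5 - r.
    total = sum(r - 1 if i % 2 == 0 else 5 - r
                for i, r in enumerate(responses))
    return total * 2.5  # scale the 0-40 sum to 0-100
```

A neutral "3" on every item yields a score of 50, which is useful as a sanity check when computing scores like the round means reported below.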
Data Analysis for Iterative Redesign of IADx
After each session, audio recordings were transcribed by study staff and independently coded (S.D.M. and Z.M.) according to a priori defined categories of usability, visibility, workflow, content, understandability, practical usefulness, medical usefulness, and navigation (see [Supplementary Material E], available in the online version).[8] Coders met to reconcile differences. The coded data were interpreted using HF principles, including the five "rights" of CDS,[21] with an HF engineer (A.P.G.) and incorporated into tool redesign. Across each of the two rounds of IADx redesign, participants were recruited until all categories of "Think Aloud" feedback achieved theoretical saturation.[35] Task completion and perceived difficulty rates, as well as SUS scores, were compared between rounds by Fisher's exact test and t-test using R version 3.5.2.
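The analysis was run in R; as a self-contained illustration of the Fisher's exact comparison (the SUS comparison used a t-test analogously), a two-sided 2×2 test can be computed directly from the hypergeometric distribution. The counts below are hypothetical, chosen only to be consistent with the difficulty rates reported in the Results (33% in round one vs. 0% in round two), not the actual trial counts:

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]],
    computed by summing hypergeometric probabilities of all tables with
    the same margins that are no more likely than the observed table."""
    row1, col1, n = a + b, a + c, a + b + c + d
    def p_table(x):  # P(top-left cell = x) under fixed margins
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    p_obs = p_table(a)
    lo, hi = max(0, row1 - (n - col1)), min(row1, col1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs + 1e-12)

# Hypothetical lab-ordering difficulty counts: [difficult, not difficult]
# per round -- 1/3 in round one vs. 0/3 in round two.
p = fisher_exact_2x2(1, 2, 0, 3)
```

With such small counts the two-sided p-value is 1.0, consistent with the paper's finding of no significant between-round difference.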
Results
Process Map
Screening occurred in the context of clinic visits, wherein the provider had to remember to consider screening, order laboratories, and follow up on results. Iron treatment was initiated on review of laboratory tests during clinic visits or In Basket encounters ([Fig. 1]) and required calculation of dosage, ordering of follow-up laboratories, and patient/family education.
Fig. 1 Process map for care of children with anemia and inflammatory bowel disease before
and after IADx. Black lines indicate baseline and post-IADx workflow. Red lines are replaced by IADx. Blue lines indicate actions taken over by IADx. IADx, IBD Anemia
Diagnosis Tool; HMP, health maintenance plan; IBD, inflammatory bowel disease; IDA,
iron deficiency anemia.
IADx Design
During the initial design meetings, IADx was created to act at the steps of screening for IDA, interpretation of laboratory results, and ordering of iron and repeat IDA screening for patients found to have IDA ([Fig. 1]). The initial IADx prototype displayed laboratories for which the patient was due and linked to an order set containing laboratory orders, iron orders, and patient instructions, which were preselected and editable by the user.
“Think Aloud” Usability Testing Results
Six providers participated across two rounds of testing. Participant characteristics except for years in practice were similar between rounds ([Table 2]).
Table 2
Participant demographics

| | Round 1 (N = 3) | Round 2 (N = 3) |
|---|---|---|
| Years in practice, mean [range] | 8.3 [3.0-13.0] | 17.5 [1.5-26.0] |
| Years at Hopkins, mean [range] | 16.3 [3.0-37.0] | 16.5 [1.5-25.0] |
| Role, n (%) | | |
| &nbsp;&nbsp;Attending | 1 (33.3%) | 2 (66.7%) |
| &nbsp;&nbsp;Fellow | 1 (33.3%) | 1 (33.3%) |
| &nbsp;&nbsp;Nurse practitioner | 1 (33.3%) | – |
| Female, n (%) | 2 (66.7%) | 2 (66.7%) |
| BPA experience, n (%) | | |
| &nbsp;&nbsp;Low | 2 (66.7%) | 2 (66.7%) |
| &nbsp;&nbsp;Medium | 1 (33.3%) | 1 (33.3%) |
| &nbsp;&nbsp;High | – | – |

Abbreviation: BPA, best practice alert.
[Table 3] summarizes "Think Aloud" and semi-structured interview comments across two rounds of testing with related design choices, HF principles, and process implications. [Fig. 2A and B] contain screenshots of IADx with highlighted HF redesign principles. Coded provider feedback was incorporated into iterative redesign. In round one, providers desired to "see a trend" of laboratories and normal values to make treatment decisions, so laboratory trends and normal ranges were added. End-users raised concerns about redundant and confusing features in the tool, including laboratory due dates, so these were removed. Providers reported that they did not use oral iron to treat IDA, preferring intravenous (IV) iron, and desired auto-calculation of the IV iron treatment dose.
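The text does not specify which dose calculation IADx automated; one widely used weight-based approach to estimating a total IV iron dose is the Ganzoni formula, sketched here purely as an illustration (the target hemoglobin default and iron-store convention are assumptions, and real dosing must follow the product label and clinical judgment):

```python
def ganzoni_iron_deficit_mg(weight_kg, hgb_g_dl, target_hgb_g_dl=12.0):
    """Total body iron deficit (mg) by the Ganzoni formula:
    weight (kg) x (target Hb - actual Hb, g/dL) x 2.4 + iron stores,
    with stores estimated as 15 mg/kg capped at 500 mg (a common convention)."""
    if weight_kg <= 0 or hgb_g_dl < 0:
        raise ValueError("weight must be positive and hemoglobin non-negative")
    stores = min(15 * weight_kg, 500)
    deficit = weight_kg * (target_hgb_g_dl - hgb_g_dl) * 2.4 + stores
    return round(deficit)
```

For example, a 40 kg patient with a hemoglobin of 9 g/dL would have an estimated deficit of 40 × 3 × 2.4 + 500 = 788 mg, the kind of arithmetic providers described as "confusing... to do" by hand.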
Fig. 2 IADx across two iterations. (A) IADx build after incorporating feedback from round one of usability evaluation: 1. Interruptive alert after one round of feedback. 2. Best practice advisory patient data summary after one round of feedback. 3. Best practice advisory data summary and laboratory orders after one round of feedback. 4. Best practice advisory laboratory and medication orders after one round of feedback. (B) IADx build after incorporating feedback from round two of usability testing: 1. Interruptive alert after two rounds of feedback. 2. Normal values table after two rounds of feedback. 3. Best practice advisory medication orders after two rounds of feedback. IADx, IBD Anemia Diagnosis Tool.
Table 3
End-user feedback mapped to design principles and process implications

| End-user feedback (round) | Usability dimension | Design choice or feature | Human factors design principle | Potential process implications |
|---|---|---|---|---|
| "I cannot remember every [normal value] off the top of my head." (1) | Usability | Include normal value ranges. | Dependence on memory[51] | Facilitates provider information analysis of laboratory data by reducing need to memorize or look up normal laboratory values. |
| "So here's an iron order in the smartset?" (1) | Visibility | Make iron orders more visible. | Enhance visibility[52]; automation of decision selection[46] | Facilitates provider decision selection by making iron orders more prominent. |
| "It says some blood is due in December 2018, which hasn't yet happened." (1) | Content | Remove dates laboratories are due. | Unnecessary information[51] | Reduces time spent using the tool. |
| "It's confusing for providers to do that equation." (1) | Practical usefulness | Automatically calculate iron treatment dose by weight. | Build in calculators[51]; automation of information analysis[46] | Facilitates provider information analysis by automating a complex calculation of iron treatment dose. |
| "See 2-3 [hemoglobin] values before that so you can see a trend." (1) | Medical usefulness | Include trends for laboratories. | Patient information summary[51]; automation of information acquisition[46] | Facilitates provider information analysis and decision selection by providing laboratory information in the context of prior trends. |
| "So it looks like he is on fer-in-sol so in this circumstance…I would definitely transition him to the IV iron." (1) | Medical usefulness | Do not auto-select medication orders. | Automation of action implementation[46] | Preserves provider autonomy by including options of iron medications to order, but leaving the ultimate decision of whether or not to order these up to the provider. |
| "Here I see there's normals. I didn't notice that before." (2) | Visibility | Make normal value chart more easily visible. | Information order and layout[51] | Facilitates provider information analysis of laboratory data by making the normal value chart more prominent. |
| "Why do I have to click that and then have a comment box enter?" (2) | Usability | Remove the comment box and acknowledge reason. | Minimize workload[53] | Reduces time spent using the tool. |
| "The pop-up really brings your attention to the anemia." (2) | Visibility | Use alert reminder. | Alert timing[51] | Increases likelihood of provider awareness of anemia in patient population. |
| "It would be helpful if that took me to the alert." (2) | Workflow | Link to the BPA section from the alert. | Alert timing[51] | Reduces time spent using the tool. |
| "I presume they are not on iron at this time, right?" (2) | Understandability | Include whether or not patient is on iron in the BPA. | Match between system and world[52]; automation of information acquisition[46] | Facilitates provider information analysis by automating information acquisition. |
| "It would also be nice if the treatment plan would actually show up here." (2) | Navigation | Include functioning link to iron infusion order set. | Number of steps[51]; consistency and standards[52] | Reduces time spent using the tool by embedding links to related order sets. |

Abbreviations: BPA, best practice alert; IV, intravenous.
Comments from round two affirmed that end-users felt a pop-up alert would be preferred, as it "really brings your attention to the anemia," and that the pop-up should fire at the time of order entry. "Think Aloud" feedback showed that providers preferred a more streamlined CDSS without a comment box in the pop-up alert but with a link to pertinent order sets. In addition to laboratory trends, providers requested data on iron prescriptions, which were added. See [Table 3] and [Fig. 2B] for examples.
SUS scores were similar across rounds with mean (standard deviation) scores of 77.5
(5.0) in round one and 75.0 (7.5) in round two, with no major change in score on any
individual question. Most activities were completed easily or with little difficulty
except placement of laboratory orders, which was difficult 33% of the time in round
one and 0% in round two.
Discussion
Here, we present the HCD of IADx from development of a process map through iterative prototyping with formative usability evaluation to create a usable CDSS tool. We show how HF principles can be incorporated into CDSS creation at each stage of prototype development[22] within a commercial EHR.[12][36]
Lessons Learned
A striking result of the "Think Aloud" feedback was the provider preference for an interruptive alert. Prior studies have shown that interruptive alerts are frequently disabled[37][38] due to inappropriate alert triggering, poor usability, or poor fit with the workflow,[39] with recommendations to avoid interruptive alerts for nonemergent CDSS.[40][41] There is some evidence that interruptive alerts do cause more change in provider behavior than noninterruptive alerts,[39] giving credence to the idea that a noninterruptive IADx would commonly be ignored.[42] Provider preference may partly be explained by the fact that users were only offered the options of either an interruptive or a noninterruptive BPA. A noninterruptive alert could be preferred, as in the vitamin D soft-stop reminder at order entry in Hendrickson et al[43] or the dynamic order set to impact inpatient blood utilization in Ikoma et al.[42] Alternatively, providers may tolerate this interruptive alert because it was customized to local needs, occurred at the right time in the workflow, and contained sufficient information for decision making.[21][44][45]
"Think Aloud" testing showed the provider preference for high levels of automation of information acquisition and analysis, with some automation of decision selection (laboratories but not medications) and no automation of action implementation.[46] In the case of IADx, the low risk of laboratory ordering and the higher risk of iron medication may have contributed to this preference. Other studies on HCD of CDSS tools similarly showed that providers valued auto-retrieval of key data and prepopulation of available clinical information[16] but preferred that ultimate decisions about what to do with the automatically gathered and analyzed information be left to the end-user.[12]
Using HF principles to interpret end-user feedback for iterative redesign of IADx resulted in a tool that put everything needed for a decision on a single screen[16] through medication history in the linked BPA, a patient information summary, laboratory trends with normal ranges, and built-in calculators. We would have liked to format the laboratory trends and normal ranges in a more visually elegant tabular display, but limitations of the EHR environment precluded this. A future IADx iteration could be implemented using web-based tools, including CDS Hooks[47] and SMART on FHIR,[48] to allow for scaling across sites and improved flexibility in the presentation of patient-level data pertinent to decision making.
Limitations
This single-center study created a tool in response to local factors, which may limit generalizability. Due to time and funding constraints, the baseline workflow evaluation incorporated interviews with only two physician practitioners rather than more in-depth methodology,[49][50] potentially missing opportunities for interventions besides an interruptive alert. The small number of volunteer participants may introduce self-selection bias toward end-users interested in technology. The study focused on feedback from provider end-users, missing input from other members of the care team. This study was limited to the formative design of a CDSS rather than outcomes of implementation, which are critically needed to steer HCD methodology in directions that produce not just usable and workflow-integrated tools but ones that improve real-world clinical outcomes. Many of these limitations can be addressed in a future multicenter CDSS development and implementation effort.
Conclusion
This study presents the HCD of IADx, a tool to improve IDA care in children with IBD.
The principles of high levels of automation of information acquisition and analysis
but more limited automation of decision selection and action implementation may be
broadly applicable to the development of other CDSS tools. In some cases, locally
developed CDSS that is well-integrated into existing workflows may be usefully deployed
as an interruptive alert when preferred by providers.
Clinical Relevance Statement
HCD helps develop CDSS tools that are integrated into clinical workflows. Practitioners
want CDSS that automates information acquisition and analysis, but prefer that clinical
decisions be left to the provider. CDSS that is well integrated into the workflow
can in some cases be implemented as an interruptive alert, even for nonemergent clinical
processes.
Multiple-Choice Questions
What are the benefits of human-centered design of clinical decision support systems?
Increased clinical throughput.
Improved integration into clinical workflows.
Shortened software development lifecycle.
Reduced requirement for clinical support staff.
Correct Answer: The correct answer is option b. Explanation: Human-centered design (HCD) has been applied to the development of clinical decision support systems (CDSS) because it brings a focus to people, tasks, and systems and seeks to incorporate human factors principles into design. HCD has been shown to improve integration of CDSS tools into clinical workflows and to improve usability. HCD involves in-depth analysis of work systems and care processes to understand cognition and tasks. This evaluation informs the design requirements and development of a CDSS prototype, which undergoes iterative redesign using HF principles by a multidisciplinary team, incorporating feedback from end-users on CDSS design characteristics.
In the case of IADx, which cognitive processes did practitioners want to be addressed
by the clinical decision support system?
Information analysis
Action implementation
Machine learning
Process redesign
Correct Answer: The correct answer is option a. Explanation: Clinician end-users universally desired that the clinical decision support system (CDSS) automate information acquisition and information analysis. Providers were somewhat split about whether the CDSS should automate decision selection, with a preference for preselecting laboratories for subsequent provider signature but not medications. Providers did not want automation of action implementation by the CDSS, such as having the system actually place orders.