DOI: 10.1055/a-2444-0342
Electronic Health Record User Dashboard for Optimization of Surgical Resident Procedural Reporting
Funding: This study was supported by the U.S. Department of Health and Human Services; National Institutes of Health; U.S. National Library of Medicine (T15 LM007450).
Abstract
Background While necessary and educationally beneficial, administrative tasks such as case and patient tracking may carry additional burden for surgical trainees. Automated systems targeting these tasks are scarce, leading to manual and inefficient workflows.
Methods We created an electronic health record (EHR)-based, user-specific dashboard for surgical residents to compile resident-specific data: bedside procedures performed, operative cases performed or participated in, and notes written by the resident as a surrogate for patients cared for. Usability testing was performed with resident volunteers, and residents were also surveyed postimplementation to assess for efficacy and satisfaction. Access log data from the EHR was used to assess dashboard usage over time. Descriptive statistics were calculated.
Results The dashboard was implemented for a population of approximately 175 surgical residents in five departments (General Surgery, Obstetrics and Gynecology, Neurosurgery, Orthopaedics, and Otolaryngology) at a single academic medical center. Six resident volunteers participating in usability testing completed an average of 96% of preset tasks independently. Average responses to five questions extracted from the System Usability Scale (SUS) ranged from 4.0 to 4.67 out of 5. Postimplementation surveys indicated high resident satisfaction (4.39 out of 5) and moderate rates of use, with 46.4% of residents using the dashboard at least monthly. Daily use of the dashboard has increased over time, especially after the dashboard was made the default for surgical residents.
Conclusion An EHR-based dashboard compiling resident-specific data can improve the efficiency of administrative tasks and supplement longitudinal education.
Keywords
surgery - residents - surgical education - electronic health record - Epic - dashboard - case logging
Background and Significance
Despite ongoing national reform in graduate medical education, the job of a surgical trainee, or “resident,” continues to be demanding. Weekly work hours approach and sometimes exceed the 80-hour limit, and overnight call responsibilities can be as frequent as every third night.[1] [2] Many surgical teams also experience high patient-to-provider ratios, adding to overall workload and stress. Furthermore, surgical residents must prepare for surgical cases and ultimately learn to become independent and capable in the operating room. It is no surprise that surgical trainees are at a high risk for physician burnout.[3] [4] Given this situation, additional administrative tasks not directly pertaining to patient care add undue burden. Nevertheless, residents have multiple such requirements that are ultimately necessary to their longitudinal education.
One critical administrative task, required for accreditation and quality control by the Accreditation Council for Graduate Medical Education (ACGME), is that all surgical residents must log both bedside procedures and operative cases to graduate from residency. The current system of case logging requires manual entry of at least nine data points (including case/patient ID, date, attending surgeon, location, level of involvement, and Current Procedural Terminology [CPT] code, among others) for each operation performed. Manual entry is consequently time consuming and not easily performed on-the-go. While there are mobile apps designed to facilitate logging, the ACGME case log website does not communicate directly with electronic health records (EHRs), and utilization of such apps is inconsistent.[5] [6] Additionally, while research here is expanding, there are few descriptions of automatic case tracking systems for residents in the literature.[7] [8] [9] [10] [11] Notably, Xiao et al and Kwan et al have described innovative systems that automatically log procedures in the ACGME case log system using EHR data, but these are not widely available or universally compatible.[11] [12] At our institution, residents must devise their own tracking systems, which may be time consuming, delayed, and at risk for incomplete logging of cases performed, as well as potentially subject to data insecurity.
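To make the manual-entry burden concrete, the following is a minimal sketch of the data points a single log entry requires; the field names are our own illustration, not the actual ACGME schema:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class CaseLogEntry:
    """Hypothetical representation of one manual ACGME case-log entry.

    Field names are illustrative only; the real ACGME case log system
    requires at least nine data points per operation.
    """
    patient_id: str   # case/patient identifier
    case_date: date   # date of operation
    attending: str    # attending surgeon of record
    location: str     # site where the case was performed
    involvement: str  # level of involvement, e.g., "Surgeon Junior"
    cpt_code: str     # Current Procedural Terminology code
    # ...plus the remaining required fields

# Example of the manual entry a resident repeats for every case:
entry = CaseLogEntry(
    patient_id="12345", case_date=date(2023, 2, 1),
    attending="Dr. Example", location="Main OR",
    involvement="Surgeon Junior", cpt_code="44970",
)
```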
Following and learning from patients that they have cared for is another important administrative task for residents.[13] Surgical residents care for patients in inpatient, outpatient, and ambulatory surgery settings. Depending on the type of encounter, a resident may interact with a patient only once (e.g., an outpatient clinic visit or a consult in the emergency department), and the resident may not be listed in the EHR as the primary provider for patients. It is therefore challenging for residents to track the outcomes of their patients after discharge. However, the educational value of long-term evaluation is substantial. For example, laboratory testing ordered in clinic may not result for several days, potentially preventing the resident from learning the outcome. Additionally, many surgical complications occur or are recognized after discharge from the hospital. Currently, no automatic system exists within the EHR at our institution to help residents track patients they have cared for after discharge.
Finally, it is well-documented that EHRs can increase the amount of time it takes for physicians to accomplish administrative and patient care-related tasks and may even contribute to burnout.[14] [15] [16] [17] Administrative tasks such as those described, if reliant on inefficient EHR workflows, may further contribute to inefficiencies and burnout. On the other hand, a well-designed EHR implementation could improve efficiency and reduce administrative burden.
Objectives
The overarching objective of this study was to build and evaluate an EHR-based tool to help surgical residents accomplish administrative tasks more easily and efficiently. Analytics dashboards have been shown in a variety of health settings to facilitate data presentation and even improve health outcomes.[18] [19] [20] [21] We built an Epic EHR-based, user-specific dashboard designed to assist surgical residents in tracking bedside procedures, operative cases, and other patients cared for. We hypothesized that a well-integrated and user-friendly tool would experience a high level of use and improve both the efficiency of and satisfaction with administrative tasks. Additionally, while not directly measured in this study, we suggest that such a tool also has the potential to improve resident education and reduce EHR-related burnout.
Methods
We conducted a single-institution, EHR-based innovation study incorporating usability testing and pre- and postimplementation assessment. Institutional review board approval was obtained, and all study procedures were conducted in accordance with institutional and international ethical standards. The tool described below was designed and implemented at Vanderbilt University Medical Center in the Epic Hyperspace EHR. It was then tested and evaluated by trainees in the following surgical residency programs: General Surgery, Obstetrics and Gynecology (OBGYN), Neurosurgery, Orthopaedics, and Otolaryngology (ENT). Together, these residency programs include over 175 trainees with surgical experience ranging from 1 to 7 years per resident.
Surgery Resident Dashboard
Surgical residents access the EHR many times daily, and it houses the vast majority of information related to the patients they care for. To maximize the data coverage and accessibility of our tool, we chose to design a user-specific Radar Dashboard within the infrastructure of the Epic EHR. The dashboard (titled the "Surgery Resident Dashboard") comprises Epic Reporting Workbench reports that pull resident-specific patient data from the EHR, as well as components designed to improve administrative efficiency (e.g., helpful external links such as the ACGME case logging website). The reports were designed to target the following categories of data: bedside procedures, operative cases, specialty-specific operative cases (e.g., robotic cases, microscopic cases), and notes written by the resident. The last category is intended as a surrogate data source for all patients cared for by residents, including nonoperative patients. To balance technical performance with a useful amount of data, default lookback times were set at 6 months. The data from each report are displayed as a combination of interactive tables and bar graphs, and residents have the ability to manipulate query parameters such as lookback time or the attending physician associated with a case or note ([Fig. 1]).
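The report logic itself lives in Epic's Reporting Workbench and is not expressible as portable code, but its behavior can be sketched: filter events to the logged-in resident and a configurable lookback window (default 6 months), optionally restrict by attending, then aggregate for display. A minimal Python sketch under those assumptions (record fields are hypothetical):

```python
from datetime import datetime, timedelta
from collections import Counter

DEFAULT_LOOKBACK_DAYS = 183  # approximates the dashboard's 6-month default

def resident_cases(events, resident_id, lookback_days=DEFAULT_LOOKBACK_DAYS,
                   attending=None):
    """Filter case events to one resident within a lookback window.

    `events` is a list of dicts with hypothetical keys: 'resident_id',
    'date' (datetime), 'attending', 'procedure'. This mirrors the
    reports' user-specific filtering and adjustable query parameters.
    """
    cutoff = datetime.now() - timedelta(days=lookback_days)
    return [e for e in events
            if e["resident_id"] == resident_id
            and e["date"] >= cutoff
            and (attending is None or e["attending"] == attending)]

def case_counts_by_procedure(cases):
    """Aggregate for the bar-graph view: counts per procedure type."""
    return Counter(e["procedure"] for e in cases)
```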


Five versions of the dashboard were developed: one for each included surgical residency program (General Surgery, OBGYN, Neurosurgery, Orthopaedics, and ENT). While most core components including the operative case and resident notes reports were present in each dashboard, dashboards were tailored to be more specific and/or useful for each residency program. For example, the General Surgery dashboard includes a report for robotic cases, and the OBGYN dashboard includes reports for vaginal deliveries.
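While this tailoring is done through Epic's dashboard configuration rather than in code, the per-program structure can be sketched as a simple mapping; report names and per-program assignments below are illustrative, not the actual build:

```python
# Hypothetical mapping of residency program to dashboard reports.
# Core reports are shared; specialty reports vary by program.
CORE_REPORTS = ["operative_cases", "bedside_procedures", "resident_notes"]

SPECIALTY_REPORTS = {
    "General Surgery": CORE_REPORTS + ["robotic_cases"],
    "OBGYN":           CORE_REPORTS + ["vaginal_deliveries"],
    "Neurosurgery":    CORE_REPORTS,
    "Orthopaedics":    CORE_REPORTS,
    "Otolaryngology":  CORE_REPORTS,  # actual assignments may differ
}
```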
Usability Testing
Following initial construction and testing of the dashboard, two rounds of iterative usability testing were performed with volunteer surgical residents. All general surgery residents were invited to participate in testing, and the first six volunteers were included (three per round). The dashboard was updated between rounds of testing to improve report performance. Testing interviews consisted of 30-minute meetings with the following components: introduction and brief dashboard demonstration; task completion testing with observation; and feedback. The REDCap data management platform was used to record resident demographics, the results of task completion testing, and feedback.[22] [23] Task completion testing included 13 specific tasks designed to test basic usability of the dashboard. The tasks were ordered both in a logical user workflow and in increasing complexity, and performance on each was assessed as "completed independently," "required minor assistance," or "required major assistance." The feedback section included four free-response questions and five relevant questions extracted from the System Usability Scale (SUS).[24] Scores for each question ranged from 0 to 5 (Strongly Disagree to Strongly Agree) and were normalized so that 5 was the desired score. Descriptive statistics were performed on usability testing results.
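For clarity, the normalization can be sketched as reverse-coding: negatively worded SUS items (e.g., "I found the system very cumbersome to use") are flipped so that 5 is always the desired score. A minimal sketch, assuming a 1-to-5 Likert coding (the study's exact coding may differ):

```python
def normalize_sus_item(raw, negatively_worded, scale_min=1, scale_max=5):
    """Reverse-code negatively worded items so 5 is always desirable."""
    return (scale_min + scale_max) - raw if negatively_worded else raw

# Example: a rating of 2 ("disagree") on a negatively worded item
# normalizes to 4, reflecting a favorable response.
assert normalize_sus_item(2, negatively_worded=True) == 4
assert normalize_sus_item(4, negatively_worded=False) == 4
```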
Surveys and Implementation
A preimplementation survey was sent to all residents in each included surgical residency program with the goals of assessing current practices as well as expectations and desires for a new tool. The response rate was 35.4% (completed by 63 of 178 residents). The General Surgery and OBGYN dashboards were implemented first (February 2023), followed by the Neurosurgery, Orthopaedics, and ENT dashboards (October 2023; [Fig. 2]). After implementation of the Surgery Resident Dashboard into the live EHR, workshops demonstrating its use and functionality were held with all residents in each residency program ([Fig. 2]). Following 8-week trial periods, postimplementation surveys were administered to assess resident satisfaction, rates of use, and perceived impact on efficiency, and to collect unstructured feedback for ongoing improvement. The response rate for the postimplementation survey was 15.7% (completed by 28 of 178 residents). Access log data from the Epic Clarity database were obtained to assess dashboard usage over time, recorded in Clarity as the number of times the dashboard was viewed in each 24-hour period. Descriptive statistics were performed on survey results.
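As a sketch of the usage analysis, raw access-log rows can be aggregated into per-day view counts; the column name below is hypothetical, not the actual Clarity schema:

```python
import pandas as pd

def daily_view_counts(access_log: pd.DataFrame) -> pd.Series:
    """Count dashboard views per 24-hour period from access-log rows.

    Assumes a DataFrame with one row per dashboard view and a
    'view_time' timestamp column; the real Clarity extract differs
    in schema but the aggregation logic is the same.
    """
    return (access_log
            .assign(day=access_log["view_time"].dt.floor("D"))
            .groupby("day")
            .size())
```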


Results
Usability Testing
Usability testing was performed in two rounds with a total of six residents ([Table 1]). Residents participating in testing represented a diversity of Epic experience levels, ranging from 1 to 5 years of surgical training (one first-year resident, one second-year resident, one third-year resident, two fourth-year residents, and one fifth-year resident). There were no technical failures during testing interviews. All six residents were able to successfully complete each of the 13 tasks with no more than minor assistance. A total of 97% and 95% of tasks were completed independently in the first and second rounds of testing, respectively. The remaining tasks were completed with minor assistance. Average responses to the five SUS questions in the first round of testing ranged from 4.33 to 4.67 out of 5, with responses ranging from 4.0 to 4.67 out of 5 in the second round ([Table 1]). Free-response questions primarily yielded minor guidance for dashboard improvement. For example, one tester stated: "CPT code column does make scrolling through these to be bulky," referring to the resident operations report. This led us to make the viewer output of the report more user friendly. Notably, one recommendation to expand the dashboard to other operative services such as Obstetrics and Gynecology ultimately led to the development of the specialty-specific dashboard versions (OBGYN, Neurosurgery, ENT, and Orthopaedics).
| | Round 1 (N = 3) | Round 2 (N = 3) |
|---|---|---|
| **Dashboard tasks (% completed independently)** | | |
| *From the login screen* | | |
| Navigate to the "Surgery Resident Dashboard" and favorite it | 100% | 100% |
| *From the dashboard* | | |
| Navigate to User SmartPhrases[a] | 100% | 100% |
| Navigate to the ACGME case log website | 100% | 100% |
| Identify how many OR cases you did in the last 30 d | 100% | 100% |
| Identify which bedside procedure you performed the most in the last 6 mo | 100% | 100% |
| Identify how many outpatient progress notes you wrote in the last 6 mo | 100% | 100% |
| Navigate into one of the three reports | 100% | 67% |
| *From the report* | | |
| Identify a patient of interest and navigate to the patient's chart | 100% | 100% |
| Filter the report by a specific column | 100% | 100% |
| Export the result data in Excel format | 100% | 67% |
| Navigate to the report settings | 100% | 100% |
| Modify the date parameters for the report | 100% | 100% |
| Add a criterion for a specific attending | 67% | 100% |
| **Total** | 97% | 95% |
| **System Usability Scale (mean score)** | | |
| I think that I would like to use this system frequently | 4.67 | 4.67 |
| I thought the system was easy to use | 4.67 | 4.67 |
| I think that I would need the support of a technical person to be able to use this system | 4.33 | 4.0 |
| I thought there was too much inconsistency in this system | 4.67 | 4.33 |
| I found the system very cumbersome to use | 4.33 | 4.33 |
Abbreviations: ACGME, Accreditation Council for Graduate Medical Education; OR, operating room.
Note: System Usability Scale answer choices ranged from “strongly disagree” to “strongly agree” and have been normalized to a maximum positive value of 5.
a Terminology specific to Epic electronic health record.
Preimplementation Survey
Sixty-three residents from across the five included specialties completed the preimplementation survey ([Table 2]). Based on survey results, 44.4% of residents spend less than 30 minutes per month logging cases on average, while the remainder spend between 30 minutes and more than 2 hours. A total of 63.5% of residents reported losing track of (i.e., never logging) at least one case per month. The most common method of keeping track of nonoperative patients was manual list keeping in the EHR (60.3%). Half or more of residents indicated that each of the following components would be very useful in an EHR-based tool: data on operative cases and nonoperative patients; data on personal complications; and presentation of personal case and clinic schedules. Satisfaction with the current case logging system and perceived support in completing administrative tasks received mean scores of 2.33 and 2.23 out of 5, respectively ([Table 2]). Free-response comments at the end of the survey were consistent with several of the eventual components of the dashboard: "I think we as residents do a lot of time-consuming administrative tasks that could be facilitated by the use of Epic, e.g., logging cases, logging duty hours, making weekly schedules;" "Another useful feature [would be] the ability to track patients you don't operate on. I think that's a huge part of the learning, e.g., what happened to that consult that we recommended nonoperative treatment for? Was my overnight interpretation of the imaging correct or overruled in the morning by the day team? I think we can learn a lot from the decisions we make on an isolated shift if we're able to look back at the outcome."
| | N = 63 |
|---|---|
| **Case logging** | **%** |
| *Time spent logging cases each month* | |
| 0–30 min | 44.4 |
| 31–60 min | 34.9 |
| 1–2 h | 15.9 |
| >2 h | 4.8 |
| *Frequency of losing track of cases* | |
| Almost never | 19.0 |
| A few times per year | 17.5 |
| 1–2 times per month | 33.3 |
| 1–2 times per week | 17.5 |
| >2 times per week | 12.7 |
| **Patient tracking and dashboard features** | **%[a]** |
| *Current method of tracking nonoperative patients* | |
| No method | 34.9 |
| Manual list in Epic | 60.3 |
| Paper list or journaling | 7.9 |
| Electronic chart search | 19.0 |
| Other (free response) | 3.2 |
| *Important features for a resident dashboard to have* | |
| Data on operative cases | 90.5 |
| Data on bedside procedures | 42.9 |
| Data on nonoperative patients | 55.6 |
| Data on complications | 58.7 |
| Personal Epic usage patterns | 36.5 |
| Links to Epic tools (e.g., SmartPhrases) | 38.1 |
| Links to other websites (e.g., case log website) | 44.4 |
| Case and clinic schedules | 65.1 |
| Other (free response) | 1.6 |
| **Satisfaction questions** | **Mean[b]** |
| How satisfied are you with the current system of case logging? | 2.33 |
| How well supported do you feel in completing administrative tasks? | 2.23 |
a Percentages do not add to 100%.
b Scale: very dissatisfied/unsupported (1) to very satisfied/supported (5).
Postimplementation Survey and Dashboard Usage
Twenty-eight residents from five specialties completed the postimplementation survey ([Table 3]). A total of 46.4% of residents reported using the Surgery Resident Dashboard multiple times per month or more frequently. A total of 53.6% of residents reported using the dashboard only a few times or not at all, with the most common reason for lack of use being that they forgot about it (66.7%). Only 6.7% of these residents did not find the dashboard useful, and 0% experienced technical issues. Operative case tracking was reported to be the most useful feature of the dashboard (75% of participants). Compared with the preimplementation survey, participants reported spending a greater amount of time per month logging cases and similar rates of lost cases ([Tables 2], [3]). However, participants subjectively reported that both of these metrics had likely improved (mean: 3.41 out of 5 for case logging time; 3.64 out of 5 for frequency of losing cases; range: definitely no improvement [1] to definite improvement [5]). Satisfaction with the Surgery Resident Dashboard was a mean of 4.39 out of 5, and there was an increase in the overall feeling of support in completing administrative tasks (3.35 out of 5, vs. 2.23 on the preimplementation survey).
| | N = 28 |
|---|---|
| **Dashboard usage and features** | **%** |
| *How often have you used the Surgery Resident Dashboard?* | |
| Never | 28.6 |
| A few times | 25.0 |
| Multiple times per month | 35.7 |
| Multiple times per week | 10.7 |
| *Reasons for not using the dashboard (if "never" or "a few times")* | %[a] |
| Did not know about the dashboard | 13.3 |
| Forgot about the dashboard | 66.7 |
| Did not find the dashboard useful | 6.7 |
| Did not know how to access the dashboard | 6.7 |
| Technical issues | 0.0 |
| Other (free response) | 26.7 |
| *What features of the dashboard have been most useful?* | |
| Bedside procedure tracking | 28.6 |
| Operative case tracking | 75.0 |
| Specialty case tracking (e.g., robotic cases, microscope cases, etc.) | 10.7 |
| Clinical note tracking | 21.4 |
| General patient tracking | 17.9 |
| Upcoming cases report | 0.0 |
| Hyperlinks and Epic resources | 0.0 |
| Clinic schedule application | 0.0 |
| Other | 0.0 |
| **Case logging** | **% or mean** |
| *Time spent logging cases each month* | |
| 0–30 min | 22.2 |
| 31–60 min | 55.6 |
| 1–2 h | 18.5 |
| >2 h | 3.7 |
| Has this changed as a result of using the Surgery Resident Dashboard?[b] | 3.41 |
| *Frequency of losing track of cases* | |
| Almost never | 22.2 |
| A few times per year | 18.5 |
| 1–2 times per month | 22.2 |
| 1–2 times per week | 18.5 |
| >2 times per week | 18.5 |
| Has this changed as a result of using the Surgery Resident Dashboard?[b] | 3.64 |
| **Satisfaction questions** | **Mean[c]** |
| Overall, how satisfied are you with the Surgery Resident Dashboard? | 4.39 |
| How well supported do you feel in completing administrative tasks? | 3.35 |
a Percentages do not add to 100%.
b Scale: definitely no (1) to definitely yes (5).
c Scale: very dissatisfied/unsupported (1) to very satisfied/supported (5).
Free response comments from the postimplementation survey were generally positive and raised opportunities for continued improvement: “It is how I log my cases now and very helpful;” “I will likely eventually use it to check all the cases I have been a part off and make sure I didn't forget to log one;” “Very much like that with operative case reports it has [medical record number] and CPT with each patient without having to click in to the chart.”
Based on Epic access log data, daily use of the Surgery Resident Dashboard has increased over time ([Fig. 2]). The most notable increase was observed after the Surgery Resident Dashboard was set as the default dashboard for surgical residents (July 2023). Spikes in use can also be observed in correlation with the dates of dashboard demonstrations. As of April 2024, the dashboard is used approximately 20 times per day.
Discussion
Surgical residents perform difficult jobs with multiple intrinsic contributors to stress and burnout. Administrative tasks such as case logging and patient tracking are inevitable and important components of the job, but inefficiencies may contribute to additional stress or simple avoidance when possible. We built an EHR-based "Surgery Resident Dashboard" to aggregate several difficult or inefficient tasks, with the goals of improving efficiency and, ultimately, resident satisfaction and education. Postimplementation survey results indicate high satisfaction with this tool, which met most needs expressed in the preimplementation survey. Furthermore, we observed an improvement in residents' perception of support in completing administrative tasks. These results suggest that even a relatively simple implementation such as this dashboard can have an immediate positive effect on resident satisfaction with necessary administrative tasks. We hypothesize that this improvement may in turn reduce the contribution of such tasks to physician burnout and augment resident education.
Despite the generally positive response to the dashboard indicated by the postimplementation survey and its increasing rate of use, the survey also demonstrated that not all intended users of the dashboard were reached or well educated during our implementation. The reasons for this are likely multifactorial. One notable example from the survey demonstrates the possible utility of additional demonstrations and/or dashboard documentation: "My understanding was there was one session to go over it, and I wasn't there and tried to figure it out myself but it wasn't intuitive so I just kept tracking my cases how I normally would. Maybe there needs to be a how to document." Another barrier to use is the existence of well-trusted manual pathways for case logging, especially by more senior residents ("As a chief resident, I used it to cross reference already logged cases. Other than that I already have a system to log cases;" "I log my cases after each case I do so the Surgery Resident Dashboard has not affected me much"). Based on usability testing and the postimplementation survey, technical issues and complexity were not likely to be major factors in lack of dashboard use. However, the significant jump in dashboard use observed after it was made the default for surgical residents (reducing the number of clicks to access it) reinforces the concept that ease of access improves utility.
A potential limitation to any EHR-based implementation for resident case logging is the relative sensitivity of EHR data compared with manual case logging. For example, in our system the circulator nurse in the operating room must mark a resident as present in order for that resident to be associated with the case in the EHR and, therefore, for the case to populate in their dashboard. The rate of error here is unknown but certainly greater than zero. However, recent reports assessing automated EHR-based case logging systems suggest that EHR data, while not perfect, have a high level of sensitivity compared with manual case logs, which are subject to human error.[10] [11] Indeed, Kwan et al report that EHR-based case logs are more complete than manual logs in some situations, such as for more experienced residents who have already met required case minimums in their case log. An in-depth sensitivity analysis of the dashboard reports, for example, comparing the resident operations report to ACGME case log data, was out of scope for this report but will be the subject of further study.
Another limitation of this study is that it was performed at a single center. However, all build techniques for the Surgery Resident Dashboard utilized standard Epic EHR capabilities, making this tool generalizable to other institutions with the same EHR. Finally, completion rates for the pre- and postimplementation surveys represented 35.4% and 15.7% of the populations for whom the dashboards were implemented. The average response rate for online surveys administered to surgical residents has been reported to be 36.4%, so postimplementation survey results in particular may not completely represent the opinions of the entire cohort.[25]
Future directions for this project include the incorporation of other data that residents indicated would be valuable in surveys, including data on postoperative complications and EHR usage data. The automated tracking of complications represents a larger issue in surgical informatics, which may be amenable to machine learning, natural language processing, and large language model techniques. Another goal is to streamline the implementation of similar dashboards at other academic centers with the Epic EHR, which is technically feasible with some localization. Ultimately, the authors hope to see some administrative tasks, such as case logging, become almost entirely automated. In the meantime, EHR-integrated tools such as the Surgery Resident Dashboard have the potential to improve resident job satisfaction and well-being.
Conclusion
Current systems that surgical residents use for the completion of administrative tasks such as operative case logging are frequently manual and inefficient. An EHR-based tool utilizing existing capabilities has the potential to reduce the burden of administrative tasks and improve efficiency. Barriers to implementation and the sensitivity of such tools warrant further investigation.
Clinical Relevance Statement
This work has direct relevance to patient care in that the developed tool, the Surgery Resident Dashboard, is actively used by surgical trainees to keep track of patients they care for and to optimize their use of the EHR. Indirectly, the project also supports patient care by supporting resident education and wellness.
Multiple-Choice Questions
1. Which of the following data reports are currently included in the Surgery Resident Dashboard?

   a. Resident operations
   b. Resident complications
   c. EHR efficiency data
   d. Resident work hours

   Correct Answer: The correct answer is option a. Of these answer choices, currently only reports for resident operations are included in the dashboard, along with reports for bedside procedures, clinical notes, vaginal deliveries (for OBGYN residents), robotic procedures, and microscope procedures. Presenting data on operative complications is a future goal for this study. EHR efficiency data for physicians is reported by Epic through its Signal application.

2. In the postimplementation survey, which dashboard feature was reported by residents to be the most useful?

   a. Bedside procedure tracking
   b. Operative case tracking
   c. Clinical note tracking
   d. Specialty case tracking

   Correct Answer: The correct answer is option b. A total of 75.0% of survey respondents reported that operative case tracking was one of the most useful features of the dashboard. While some survey respondents did report the other listed features as being the most useful, these percentages were much smaller.
Conflict of Interest
None declared.
Acknowledgments
The authors would like to acknowledge the National Library of Medicine T15 training grant in biomedical informatics (T15 LM007450) and the surgical residency training programs and residents at Vanderbilt University Medical Center for their participation in this project.
Protection of Human and Animal Subjects
The study was performed in compliance with the World Medical Association Declaration of Helsinki on Ethical Principles for Medical Research Involving Human Subjects and was reviewed by the Vanderbilt Institutional Review Board.
References
- 1 Lin JA, Pierce L, Murray SG. et al. Estimation of surgical resident duty hours and workload in real time using electronic health record data. J Surg Educ 2021; 78 (06) e232-e238
- 2 Bennett CL, McDonald DA, Chang Y. et al. A national cross-sectional study of surgery residents who underreport duty hours. J Surg Educ 2017; 74 (06) 928-933
- 3 Rodrigues H, Cobucci R, Oliveira A. et al. Burnout syndrome among medical residents: a systematic review and meta-analysis. PLoS One 2018; 13 (11) e0206840
- 4 Pulcrano M, Evans SR, Sosin M. Quality of life and burnout rates across surgical specialties: a systematic review. JAMA Surg 2016; 151 (10) 970-978
- 5 George BC, Bohnen JD, Schuller MC, Fryer JP. Using smartphones for trainee performance assessment: a SIMPL case study. Surgery 2020; 167 (06) 903-906
- 6 Eaton M, Scully R, Schuller M. et al. Value and barriers to use of the SIMPL tool for resident feedback. J Surg Educ 2019; 76 (03) 620-627
- 7 Bhattacharya P, Van Stavern R, Madhavan R. Automated data mining: an innovative and efficient web-based approach to maintaining resident case logs. J Grad Med Educ 2010; 2 (04) 566-570
- 8 Chen P-H, Chen YJ, Cook TS. Capricorn-a web-based automatic case log and volume analytics for diagnostic radiology residents. Acad Radiol 2015; 22 (10) 1242-1251
- 9 Bachur RG, Nagler J. Use of an automated electronic case log to assess fellowship training: tracking the pediatric emergency medicine experience. Pediatr Emerg Care 2008; 24 (02) 75-82
- 10 Smith M, Layng T. A comparison of resident procedure logs to data generated from an electronic health record. Appl Clin Inform Open 2023; 7: e87-e90
- 11 Kwan B, Engel J, Steele B. et al. An automated system for physician trainee procedure logging via electronic health records. JAMA Netw Open 2024; 7 (01) e2352370
- 12 Xiao G, Sikder S, Woreta F, Boland MV. Implementation and evaluation of integrating an electronic health record with the ACGME case log system. J Grad Med Educ 2022; 14 (04) 482-487
- 13 Rudolf F, Oyama LC, El-Kareh R. Impact of an automated patient outcome feedback system on emergency medicine resident patient follow-up: an interrupted time series analysis. AEM Educ Train 2024; 8 (04) e11011
- 14 Johnson KB, Neuss MJ, Detmer DE. Electronic health records and clinician burnout: a story of three eras. J Am Med Inform Assoc 2021; 28 (05) 967-973
- 15 Yan Q, Jiang Z, Harbin Z, Tolbert PH, Davies MG. Exploring the relationship between electronic health records and provider burnout: a systematic review. J Am Med Inform Assoc 2021; 28 (05) 1009-1021
- 16 Baxter SL, Saseendrakumar BR, Cheung M. et al. Association of electronic health record inbasket message characteristics with physician burnout. JAMA Netw Open 2022; 5 (11) e2244363
- 17 Tai-Seale M, Baxter S, Millen M. et al. Association of physician burnout with perceived EHR work stress and potentially actionable factors. J Am Med Inform Assoc 2023; 30 (10) 1665-1672
- 18 Nelson O, Sturgis B, Gilbert K. et al. A visual analytics dashboard to summarize serial anesthesia records in pediatric radiation treatment. Appl Clin Inform 2019; 10 (04) 563-569
- 19 Safranek CW, Feitzinger L, Joyner AKC. et al. Visualizing opioid-use variation in a pediatric perioperative dashboard. Appl Clin Inform 2022; 13 (02) 370-379
- 20 Hysong SJ, Yang C, Wong J, Knox MK, O'Mahen P, Petersen LA. Beyond information design: designing health care dashboards for evidence-driven decision-making. Appl Clin Inform 2023; 14 (03) 465-469
- 21 Jonnalagadda P, Swoboda C, Singh P. et al. Developing dashboards to address children's health disparities in Ohio. Appl Clin Inform 2022; 13 (01) 100-112
- 22 Harris PA, Taylor R, Minor BL. et al; REDCap Consortium. The REDCap consortium: building an international community of software platform partners. J Biomed Inform 2019; 95: 103208
- 23 Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, Conde JG. Research electronic data capture (REDCap)–a metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform 2009; 42 (02) 377-381
- 24 Brooke J. SUS: a 'quick and dirty' usability scale. In: Jordan P, McClelland I, Weerdmeester B. eds. Usability Evaluation in Industry. London: Taylor & Francis; 1996: 189-194
- 25 Yarger JB, James TA, Ashikaga T. et al. Characteristics in response rates for surveys administered to surgery residents. Surgery 2013; 154 (01) 38-45
Publication History
Received: 01 July 2024
Accepted: 16 October 2024
Accepted Manuscript online: 17 October 2024
Article published online: 26 February 2025
© 2025. Thieme. All rights reserved.
Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany