CC BY-NC-ND 4.0 · Ultrasound Int Open 2022; 08(01): E2-E6
DOI: 10.1055/a-1795-5138
Original Article

Hands-On Time in Simulation-Based Ultrasound Training – A Dose-Related Response Study

Oria Mahmood
1   Copenhagen Academy for Medical Education and Simulation, University of Copenhagen and The Capital Region of Denmark, Copenhagen, Denmark
2   Department of Anaesthesiology and Intensive Care, Holbaek hospital, Holbaek, Denmark
,
Rikke Jeong Jørgensen
1   Copenhagen Academy for Medical Education and Simulation, University of Copenhagen and The Capital Region of Denmark, Copenhagen, Denmark
,
Kristina Rue Nielsen
1   Copenhagen Academy for Medical Education and Simulation, University of Copenhagen and The Capital Region of Denmark, Copenhagen, Denmark
3   Department of Radiology, Copenhagen University Hospital, Rigshospitalet, Copenhagen, Denmark
,
Lars Konge
1   Copenhagen Academy for Medical Education and Simulation, University of Copenhagen and The Capital Region of Denmark, Copenhagen, Denmark
,
Lene Russell
1   Copenhagen Academy for Medical Education and Simulation, University of Copenhagen and The Capital Region of Denmark, Copenhagen, Denmark
4   Department of Intensive Care, Copenhagen University Hospital, Rigshospitalet, Copenhagen, Denmark

Abstract

Purpose Point of care ultrasound (POCUS) is widely used, but the sensitivity and specificity of the findings are highly user-dependent. There are many different approaches to ultrasound training. The aim of this study was to explore the effects of hands-on practice when learning POCUS.

Methods Junior doctors with no or limited ultrasound experience were included in the study and divided into three groups. They all completed a Focused Assessment with Sonography for Trauma (FAST) course but with different amounts of hands-on practice: 40 minutes (n=67), 60 minutes (n=12), or 90 minutes (n=27). At the end of the course, they all completed a previously validated test.

Results More hands-on time improved the mean test scores and decreased the test time. The scores of the 40-, 60-, and 90-minute groups were 11.6 (SD 2.1), 12.8 (SD 2.5), and 13.7 (SD 2.5), respectively (p<0.001). The 90-minute group completed the test significantly faster than the other two groups (20 versus 26 minutes, p=0.003). A large inter-individual variation was seen.

Conclusion The necessary amount of hands-on training is unknown. This study demonstrates that performance increases with prolonged hands-on time, but the inter-individual variation among trainees is very large, making it impossible to define the “optimal” training time. This supports the use of the concept of mastery learning, where each individual trainee can continue training until proficiency is reached.



Introduction

The use of point of care ultrasound (POCUS) or bedside ultrasound has expanded across many medical and surgical specialties [1] and is viewed as an essential skill for the new generation of physicians [2]. However, the sensitivity and specificity of ultrasound findings are highly operator-dependent [1] [3] and exposure to ultrasound during clinical training may give a false sense of competence [4]. This could potentially put patients at risk. In fact, the Emergency Care Research Institute (ECRI) identifies incorrect usage of ultrasound as one of the top ten health technology hazards. ECRI therefore recommends that protocols for the training and examination of ultrasound users should follow established guidelines and recommendations [5].

One of the most common POCUS protocols is the Focused Assessment with Sonography for Trauma (FAST), which has become an established tool in trauma management [6]. The FAST examination consists of four simple standardized sonographic views: pericardial, perihepatic, perisplenic, and pelvic. The purpose is to identify the presence of intraperitoneal or pericardial free fluid (which in the trauma setting would most often be due to bleeding). Previous studies have reported steep learning curves when learning POCUS, including FAST [7] [8].

Ultrasound training curricula are typically based on either performing a certain number of scans or training for a pre-specified amount of time [9] [10] [11]. However, the amount of hands-on training necessary to ensure competency in a given POCUS protocol is not known. We therefore designed this study with the aim of further exploring the amount of hands-on practice necessary to learn a specific POCUS protocol: the FAST exam.



Materials and Methods

Junior doctors (<12 months of experience as medical doctors) completing their first postgraduate year in different departments were included in this study. All had no or very limited ultrasound experience: none had performed unsupervised ultrasound scans, and none had performed more than five scans under supervision. They all attended an ultrasound course in a controlled environment at a University Hospital Simulation Center. The course consisted of a theoretical lecture followed by hands-on practice on two different virtual reality simulators and on up to three healthy volunteers. All trainees performed the same examinations on the same two simulators. All training sessions were conducted by the same team, which included two specialist doctors (one anesthesiologist and one radiologist) with many years of clinical ultrasound experience, including with the FAST protocol. The hands-on training was performed under close supervision.

Before starting the course, the junior doctors were divided into three groups. They all completed the same course but with different amounts of hands-on practice: the first group had 40 minutes, the second group 60 minutes, and the third group 90 minutes ([Table 1]). During the practice, the trainees performed several supervised FAST examinations on healthy volunteers, followed by training on two different virtual reality simulators: the Schallware (Ultrasound Simulator Station 128) and the Simbionix U/S Mentor (3D Systems). Both simulators can be used for diagnostic abdominal ultrasound training; the main difference is that the Schallware simulator uses recorded films of real patient scans whereas the Simbionix simulator uses computer-generated illustrations [12]. The simulators allow trainees to perform real-time dynamic FAST examinations with positive findings (i. e., free fluid) on lifelike mannequins with a mock ultrasound probe.

Table 1 Baseline and course characteristics.

                                        40-minute group   60-minute group   90-minute group
Number                                  67                12                27
Age (years)                             28 (26–33)        28 (26–33)        28.3 (25–33)
Female (%)                              43 (64%)          8 (67%)           17 (74%)
Course details:
  Theoretical introduction (minutes)    10                20                20
  Hands-on time, volunteers (minutes)   20                30                45
  Hands-on time, simulators (minutes)   20                30                45

Data are expressed as number (percentage) or median (interquartile range); for age, the range is given. All participants had <12 months of experience as medical doctors and had no experience with unsupervised ultrasound scans; none had performed more than 5 scans with supervision.

The test

A validated test in FAST [13] was used to compare performance between the three groups. The test was performed on the Schallware ultrasound simulator and consisted of five consecutive complete FAST examinations, i. e., 20 different ultrasound views (four per examination), giving a maximum score of 20 points; 14 points were required to pass (the previously established pass/fail score). The novices had a maximum of six minutes to complete each FAST examination in the test.
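To make the scoring concrete, the logic of the test can be sketched as follows. This is a minimal illustrative sketch in Python, not the actual rating procedure: it assumes one point per adequately obtained view, and all names are hypothetical placeholders; the validated test [13] defines the real assessment.

    # Sketch of the test's scoring logic (illustrative; see [13] for the real test).
    PASS_SCORE = 14      # previously established pass/fail score
    VIEWS_PER_EXAM = 4   # pericardial, perihepatic, perisplenic, pelvic
    NUM_EXAMS = 5        # five consecutive complete FAST examinations

    def total_score(view_points):
        """Sum one point per adequately obtained view (maximum 20)."""
        assert len(view_points) == NUM_EXAMS * VIEWS_PER_EXAM
        return sum(view_points)

    def passed(score):
        return score >= PASS_SCORE

    # Example: a trainee who obtains 14 of the 20 views adequately just passes.
    points = [1] * 14 + [0] * 6
    print(total_score(points), passed(total_score(points)))  # -> 14 True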



Data analysis

The test scores and times of the three groups were compared using analysis of variance (ANOVA) followed by pair-wise comparisons using independent-samples t-tests. Levene’s test was used to compare the variances of the three groups. Differences were considered statistically significant when p<0.05. The statistical analyses were done using IBM SPSS Statistics (version 20.0, IBM Corp., released 2011), and the graphs were created using GraphPad Prism 6.00 (GraphPad Software, USA).
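For readers who wish to reproduce this type of analysis, the following is a sketch of the same statistical pipeline in Python with SciPy (the study itself used SPSS). The score arrays are placeholder draws matching the reported group means and standard deviations, not the raw study data.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    g40 = rng.normal(11.6, 2.1, 67)  # placeholders only: draws matching the
    g60 = rng.normal(12.8, 2.5, 12)  # reported means/SDs, NOT the raw data
    g90 = rng.normal(13.7, 2.5, 27)

    # One-way ANOVA across the three groups
    _, p_anova = stats.f_oneway(g40, g60, g90)

    # Pair-wise comparisons with independent-samples t-tests
    _, p_40_60 = stats.ttest_ind(g40, g60)
    _, p_60_90 = stats.ttest_ind(g60, g90)
    _, p_40_90 = stats.ttest_ind(g40, g90)

    # Levene's test for equality of variances
    _, p_levene = stats.levene(g40, g60, g90)

    print(f"ANOVA p={p_anova:.3f}, Levene p={p_levene:.3f}")
    print(f"40 vs 60 p={p_40_60:.3f}, 60 vs 90 p={p_60_90:.3f}, 40 vs 90 p={p_40_90:.3f}")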

Ethical considerations

All participants were given verbal and written information about the study, and all signed an informed consent form. None of the junior doctors included in the study were working at the same institution as the authors. The collected data were anonymized. The study was exempt from ethical approval according to Danish legislation.



Results

In total, 106 junior doctors ([Table 1]) attended the course and performed the test. The mean test score increased with a longer hands-on training time: 11.6 (SD 2.1) in the 40-minute group, 12.8 (SD 2.5) in the 60-minute group, and 13.7 (SD 2.5) in the 90-minute group (p<0.001) ([Table 2]).

Table 2 Test performance after participation in the course*.

                            40-minute group   60-minute group   90-minute group
Mean score                  11.6 (2.1)        12.8 (2.5)        13.7 (2.5)
Median score                12 (10–13)        12.5 (10–16)      13 (12–15)
Range                       7–17              9–16              10–19
Mean time (min)             26.3 (5.5)        25.9 (6.6)        20.3 (4.3)
Mean score/minute (1/min)   0.46 (0.15)       0.5 (0.19)        0.7 (0.21)
Passed test*                12 (18%)          4 (33%)           11 (41%)

Data are expressed as mean (standard deviation), median (interquartile range), or number (percentage). *All participants performed a validated test with a maximum score of 20 and a pass/fail score of 14 [13].

There was no significant difference between the 40-minute group and the 60-minute group (p=0.09) or between the 60-minute group and the 90-minute group (p=0.26). However, there was a significant difference in mean test score between the 40-minute group and the 90-minute group (p<0.001). The test was passed by 18% of the junior doctors in the 40-minute group, 33% in the 60-minute group, and 41% in the 90-minute group ([Table 2]), p=0.05. Despite the higher mean score and higher pass percentage in the groups with more hands-on time, the interindividual variation was very large in all three groups ([Fig. 1]).

Fig. 1 Scatter plot for the three groups. Each dot represents an individual trainee’s score; the thick line marks the mean test score for each group.

As seen in [Table 2], the total time the participants spent performing the test was lower in the 90-minute group than in the other two groups (20 versus 26 minutes, p=0.003). Accordingly, the 90-minute group had a significantly higher test score per minute (0.7 versus 0.5, p<0.001), but as can be seen in [Fig. 2], the variation within the group was large.

Fig. 2 Test score per minute in the three groups with different amounts of hands-on time. The central bar in the box represents the median score per minute, the box represents the interquartile range, and the whiskers represent the range.


Discussion

This study showed that more hands-on time led to a higher mean test score, i. e., an increase in hands-on training led to better overall performance in a shorter amount of time. This is no surprise. However, the important point demonstrated by this study is that even though the mean test score increased with more hands-on training time, the interindividual variation was very large ([Fig. 1] and [Fig. 2]). Substantial variation is to be expected for trainees in the first stages of training, as proposed by Fitts and Posner’s model of skills acquisition, in which performance is characterized by three sequential stages: 1) the cognitive stage, during which the trainee develops a mental picture and fuller understanding of the required action; 2) the associative stage, during which the trainee physically practices the action; and 3) the autonomous stage, during which the trainee learns to carry out the skill with little conscious effort [14]. Initially there is rapid improvement in performance, followed by a more gradual, slower phase. The speed with which the individual learner passes through these phases varies greatly. As shown in this study, time itself does not ensure proficiency, but it does improve performance in general.

In general, training in the medical field is expensive, and learning programs should be both competence-generating and cost-effective [15]. Furthermore, insufficient training, especially in user-dependent modalities such as ultrasound, can be a hazard. The best way to ensure that trainees achieve acceptable levels of performance and diagnostic accuracy remains controversial [7] [8] [9] [16]. Several factors have been shown to facilitate the learning of motor skills, such as observational learning, external focus of attention, feedback, and self-controlled practice. These variables are effective for a variety of reasons, but no single factor has been shown to be superior [10] [11]. It is relatively clear that both observing and physically practicing a task are necessary to learn it [17]. The efficacy of skills training using simulation is well documented [18], and studies also show a sustained effect of simulation-based training on clinical performance [19]. The main take-home message of this study is that practice time itself cannot be used as a measure of competence when learning point of care ultrasound. This finding aligns with a learning curve study by Gustafsson et al., in which the training time needed to reach a plateau varied widely among 38 orthopedic surgery trainees practicing hip fracture surgery on a simulator, with an average of 169 minutes (95% confidence interval: 152–187 minutes) [20].

Similarly, a fixed number of performed procedures does not ensure proficiency, as illustrated by Barsuk et al. when comparing resident physicians’ baseline simulated clinical skills (central venous catheter insertion, lumbar puncture, paracentesis, and thoracentesis) to their self-reported procedure experience [21]. However, the European Federation of Societies for Ultrasound in Medicine and Biology (EFSUMB) still proposes a minimum number of scans as a training requirement [22].

Mastery learning is a break from earlier physician training, in which acquisition and maintenance of clinical competence were based on clinical experience alone [23]. Mastery learning is an approach to competency-based education in which trainees acquire knowledge and skills against fixed achievement standards, without limiting the time needed to reach proficiency. Importantly, mastery learning results show little or no variation, whereas the educational time needed can vary among learners.

The concept of mastery learning requires a validated test with a credible pass/fail standard to assess competence [13]. It increases professional self-efficacy and translates into improved patient care practices and patient safety outcomes [24]. This study supports the concept that mastery learning is the optimal method to ensure competence, including when learning ultrasound. Newer guidelines on endoscopic ultrasound support this approach by recommending the use of validated assessment tools to ensure that training is continued until a predefined level of competence is achieved; no arbitrary number of training procedures is mentioned [25].
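To make the contrast with fixed-time curricula concrete, the following conceptual sketch (not part of the study) expresses mastery learning as a loop in which the pass/fail standard is fixed and training time is the free variable. The two helper functions and the simple linear learning model are hypothetical placeholders, not a description of any real curriculum.

    PASS_SCORE = 14  # fixed achievement standard (maximum score 20)

    def run_training_session(trainee, minutes):
        # Hypothetical model: skill grows with practice at a per-trainee rate.
        trainee["skill"] += trainee["learning_rate"] * minutes

    def administer_validated_test(trainee):
        # Hypothetical stand-in for the validated test [13]; score capped at 20.
        return min(20, round(trainee["skill"]))

    def train_to_mastery(trainee, session_minutes=40):
        """Train until the fixed standard is met; time spent is the variable."""
        total_minutes = 0
        while administer_validated_test(trainee) < PASS_SCORE:
            run_training_session(trainee, session_minutes)
            total_minutes += session_minutes
        return total_minutes

    # Two simulated trainees with different learning rates need different
    # amounts of hands-on time to reach the same standard.
    fast = {"skill": 8.0, "learning_rate": 0.10}
    slow = {"skill": 8.0, "learning_rate": 0.03}
    print(train_to_mastery(fast), train_to_mastery(slow))  # -> 80 200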

As for the FAST protocol, as with other POCUS areas, there is currently no standardization of training, and different models are used, including simulation training and live patient-based training. Each model has its own advantages and disadvantages [26], but many ultrasound courses are still arranged within a fixed timeframe, e. g., one day, or around a fixed number of scans. Different trainees learn at different paces, and a certain amount of time or a prespecified number of scans does not guarantee that sufficient competency has been achieved. It is therefore crucial to insist on mastery learning.

This is very clear from the results of this study: despite higher test scores in the group with the longest hands-on time, the fraction of trainees actually passing the test was quite low. As with many other point of care ultrasound examinations, the result of a FAST examination can have a great impact on the treatment of the patient, and the test itself was therefore designed to discriminate between FAST novices and experienced users [13]. Since the trainees in this study had no ultrasound experience before the course, this clearly demonstrates that one short course, even with an increasing amount of hands-on time, is not enough to learn how to use ultrasound in patient management, including interpretation of clinical findings; this requires more extensive training.

The most important limitation of this study is the unequal group sizes. However, sample sizes in medical education research are often small, so the large total number of participants arguably compensates for the unbalanced groups [27]. Another limitation is the lack of randomization, although the novices in the three groups were very similar: they were recruited from the same sites, they all lacked ultrasound experience, and they were all within their first year of postgraduate training. All training was conducted by the same team, and all participants performed the same standardized test.



Conclusion

This study demonstrates that scanning performance when learning ultrasound increases with prolonged hands-on time, but the interindividual variation among trainees is very large, making it impossible to define the “optimal” amount of hands-on training time. This supports the use of the concept of mastery learning, where each individual trainee must continue training until proficiency is reached.



Notice

This article was changed according to the following Erratum on July 6th, 2022.



Erratum

In the above-mentioned article, the affiliations were indicated incorrectly, the corresponding author was changed, and on p. E5, one sentence was incorrect.



Conflict of Interest

The authors declare that they have no conflict of interest.

  • References

  • 1 Moore CL, Copel JA. Point-of-care ultrasonography. N Engl J Med 2011; 364: 749-757
  • 2 Atkinson P, Bowra J, Lambert M, Noble V, Jarman B. International Federation for Emergency Medicine point of care ultrasound curriculum. CJEM 2015; 17: 161-170
  • 3 Tolsgaard MG, Rasmussen MB, Tappert C. et al. Which factors are associated with trainees’ confidence in performing obstetric and gynecological ultrasound examinations? Ultrasound Obstet Gynecol 2014; 43: 444-451
  • 4 Tripu R, Lauerman MH, Haase D. et al. Graduating Surgical Residents Lack Competence in Critical Care Ultrasound. J Surg Educ 2018; 75: 582-588
  • 5 ECRI. Top 10 Health Technology Hazards for 2020. Executive Brief. ECRI; 2020. https://www.ecri.org/landing-2020-top-ten-health-technology-hazards
  • 6 Boulanger BR, Kearney PA, Brenneman FD, Tsuei B, Ochoa J. Utilization of FAST (Focused Assessment with Sonography for Trauma) in 1999: results of a survey of North American trauma centers. Am Surg 2000; 66: 1049-1055
  • 7 Shackford SR, Rogers FB, Osler TM, Trabulsy ME, Clauss DW, Vane DW. Focused abdominal sonogram for trauma: the learning curve of nonradiologist clinicians in detecting hemoperitoneum. J Trauma 1999; 46: 553-562; discussion 562-564
  • 8 Ma OJ, Gaddis G, Norvell JG, Subramanian S. How fast is the focused assessment with sonography for trauma examination learning curve? Emerg Med Australas 2008; 20: 32-37
  • 9 Cazes N, Desmots F, Geffroy Y, Renard A, Leyral J, Chaumoitre K. Emergency ultrasound: a prospective study on sufficient adequate training for military doctors. Diagn Interv Imaging 2013; 94: 1109-1115
  • 10 Wulf G, Shea C, Lewthwaite R. Motor skill learning and performance: a review of influential factors. Med Educ 2010; 44: 75-84
  • 11 Mackay FD, Zhou F, Lewis D, Fraser J, Atkinson PR. Can You Teach Yourself Point-of-care Ultrasound to a Level of Clinical Competency? Evaluation of a Self-directed Simulation-based Training Program. Cureus 2018; 10: e3320
  • 12 Ostergaard ML, Konge L, Kahr N. et al. Four Virtual-Reality Simulators for Diagnostic Abdominal Ultrasound Training in Radiology. Diagnostics (Basel) 2019; 9: 50 DOI: 10.3390/diagnostics9020050.
  • 13 Russell L, Ostergaard ML, Nielsen MB, Konge L, Nielsen KR. Standardised assessment of competence in Focused Assessment with Sonography for Trauma. Acta Anaesthesiol Scand 2018; 68: 1154-1160
  • 14 Taylor JA, Ivry RB. The role of strategies in motor learning. Ann N Y Acad Sci 2012; 1251: 1-12
  • 15 Tolsgaard MG, Tabor A, Madsen ME. et al. Linking quality of care and training costs: cost-effectiveness in health professions education. Med Educ 2015; 49: 1263-1271
  • 16 Jang T, Sineff S, Naunheim R, Aubin C. Residents should not independently perform focused abdominal sonography for trauma after 10 training examinations. J Ultrasound Med 2004; 23: 793-797
  • 17 Granados C, Wulf G. Enhancing motor learning through dyad practice: contributions of observation and dialogue. Res Q Exerc Sport 2007; 78: 197-203
  • 18 McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003-2009. Med Educ 2010; 44: 50-63
  • 19 Tolsgaard MG, Ringsted C, Dreisler E. et al. Sustained effect of simulation-based ultrasound training on clinical performance: a randomized trial. Ultrasound Obstet Gynecol 2015; 46: 312-318
  • 20 Gustafsson A, Pedersen P, Rømer TB, Viberg B, Palm H, Konge L. Hip-fracture osteosynthesis training: exploring learning curves and setting proficiency standards. Acta Orthop 2019; 90: 348-353
  • 21 Barsuk JH, Cohen ER, Feinglass J, McGaghie WC, Wayne DB. Residents’ Procedural Experience Does Not Ensure Competence: A Research Synthesis. J Grad Med Educ 2017; 9: 201-208
  • 22 European Federation of Societies for Ultrasound in Medicine and Biology (EFSUMB). Minimum training requirements for the practice of medical ultrasound in Europe. 2009; Appendix 1
  • 23 Cook DA, Brydges R, Zendejas B, Hamstra SJ, Hatala R. Mastery learning for health professionals using technology-enhanced simulation: a systematic review and meta-analysis. Acad Med 2013; 88: 1178-1186
  • 24 Miller GE. The assessment of clinical skills/competence/performance. Acad Med 1990; 65: S63-S67
  • 25 Vilmann P, Clementsen PF, Colella S. et al. Combined endobronchial and esophageal endosonography for the diagnosis and staging of lung cancer: European Society of Gastrointestinal Endoscopy (ESGE) Guideline, in cooperation with the European Respiratory Society (ERS) and the European Society of Thoracic Surgeons (ESTS). Endoscopy 2015; 47: c1
  • 26 Salen PN, Melanson SW, Heller MB. The focused abdominal sonography for trauma (FAST) examination: considerations and recommendations for training physicians in the use of a new clinical tool. Acad Emerg Med 2000; 7: 162-168
  • 27 Cook DA, Hatala R. Got power? A systematic review of sample size adequacy in health professions education research. Adv Health Sci Educ Theory Pract 2015; 20: 73-83

Correspondence

Lene Russell, MD, PhD
Dept. of Intensive Care, Copenhagen University Hospital,
Rigshospitalet & Copenhagen Academy for Medical Education and Simulation
University of Copenhagen and The Capital Region of Denmark
Copenhagen
Denmark

Publication History

Received: 16 March 2021

Accepted after revision: 23 February 2022

Article published online:
03 May 2022

© 2022. The Author(s). This is an open access article published by Thieme under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License (CC BY-NC-ND 4.0), permitting copying and reproduction so long as the original work is given appropriate credit. Contents may not be used for commercial purposes, or adapted, remixed, transformed or built upon. (https://creativecommons.org/licenses/by-nc-nd/4.0/).

Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany

