CC BY-NC-ND 4.0 · Endosc Int Open 2022; 10(06): E815-E823
DOI: 10.1055/a-1814-9747
Original article

Assessment of esophagogastroduodenoscopy skills on simulators before real-life performance

Anders Bo Nielsen 1, 2, 3; Finn Møller Pedersen 2, 3; Christian B. Laursen 4, 5; Lars Konge 6; Stig Laursen 2, 3

Author affiliations:

1   Odense University Hospital, SimC – Simulation Center, Odense, Denmark
2   Odense University Hospital, Department of Medical Gastroenterology, Odense, Denmark
3   University of Southern Denmark, Department of Clinical Research, Odense, Denmark
4   Odense University Hospital, Department of Respiratory Medicine, Odense, Denmark
5   University of Southern Denmark, Respiratory Research Unit, Odense, Denmark
6   Capital Region of Denmark – Copenhagen Academy for Medical Education and Simulation, Copenhagen, Denmark
TRIAL REGISTRATION: Prospective study at https://www.clinicaltrials.gov/

Abstract

Background and study aims Operator competency is essential for esophagogastroduodenoscopy (EGD) quality, which makes appropriate training concluding with a final test important. The aims of this study were to develop a test for assessing skills in performing EGD, to gather validity evidence for the test, and to establish a credible pass/fail score.

Methods An expert panel developed a practical test using the Simbionix GI Mentor II simulator (3D Systems) and an EGD phantom (OGI 4, CLA Medical), with a diagnostic part (DP) and a technical skills part (TSP), for a prospective validation study. During the test a supervisor measured: 1) total time; 2) degree of mucosal visualization; and 3) identification of landmarks and pathology. The contrasting groups standard-setting method was used to establish a pass/fail score.

Results We included 15 novices (N), 10 intermediates (I), and 10 experienced endoscopists (E). Evidence of internal structure was strong, with a Cronbach’s alpha of 0.76 for TSP time consumption and 0.74 for identification of landmarks.
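For context, Cronbach’s alpha expresses, on a 0-to-1 scale, how consistently the individual test elements rank the same examinees. The snippet below is a minimal sketch of the computation in Python; the score matrix is invented for illustration and is not the study’s data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (examinees x items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # variance of each item across examinees
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the examinees' total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical scores: 5 examinees x 4 test elements (illustration only)
scores = np.array([
    [2, 3, 2, 3],
    [4, 4, 5, 4],
    [3, 3, 4, 3],
    [5, 4, 5, 5],
    [2, 2, 3, 2],
])
print(f"alpha = {cronbach_alpha(scores):.2f}")
```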

Mean total times, in minutes, for the DP were N 15.7, I 11.3, and E 7.0; for the TSP, they were N 7.9, I 8.9, and E 2.9. The total numbers of identified landmarks were N 26, I 41, and E 48. Mean visualization percentages were N 80, I 71, and E 71. A pass/fail standard was established requiring identification of all landmarks and completion of the TSP in < 5 minutes. All experienced endoscopists passed, while none of the participants in the other categories did.
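As background on the standard-setting step: the contrasting groups method compares the score distributions of a clearly competent group (here, the experienced endoscopists) with those of a clearly non-competent group (the novices) and places the cut score where the two distributions intersect. The sketch below illustrates the idea in Python under the simplifying assumption of normally distributed TSP times; the time values are invented, not the study’s raw data.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq

# Hypothetical TSP completion times in minutes (invented, not the study's data)
novice_times = np.array([7.1, 8.4, 7.9, 9.0, 7.3, 8.8])
expert_times = np.array([2.5, 3.1, 2.9, 3.3, 2.7, 3.0])

# Fit a normal distribution to each contrasting group
mu_n, sd_n = novice_times.mean(), novice_times.std(ddof=1)
mu_e, sd_e = expert_times.mean(), expert_times.std(ddof=1)

def density_gap(x: float) -> float:
    # Positive where a time is more likely under the expert distribution,
    # negative where it is more likely under the novice distribution
    return norm.pdf(x, mu_e, sd_e) - norm.pdf(x, mu_n, sd_n)

# The cut score lies where the two densities cross, between the group means
cutoff = brentq(density_gap, mu_e, mu_n)
print(f"pass/fail cutoff ~ {cutoff:.1f} minutes")
```

In practice the crossing point may be shifted toward the competent group to reduce false passes; a strict standard such as the one adopted here (all landmarks identified plus a TSP time under 5 minutes) reflects that kind of judgment.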

Conclusions We established a test that can distinguish between participants with different levels of competence. This enables an objective, evidence-based approach to the assessment of competence in EGD.



Publication History

Received: 20 October 2021

Accepted after revision: 30 March 2022

Accepted Manuscript online: 01 April 2022

Article published online: 10 June 2022

© 2022. The Author(s). This is an open-access article published by Thieme under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License (CC BY-NC-ND 4.0), which permits copying and reproduction so long as the original work is given appropriate credit. Contents may not be used for commercial purposes, or adapted, remixed, transformed, or built upon. (https://creativecommons.org/licenses/by-nc-nd/4.0/)

Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany

 
  • 36 Cold KM, Svendsen MBS, Bodtger U. et al. Using structured progress to measure competence in flexible bronchoscopy. J Thorac Dis 2020; 12: 6797-6805
  • 37 Ritter EM, Taylor ZA, Wolf KR. et al. Simulation-based mastery learning for endoscopy using the endoscopy training system: a strategy to improve endoscopic skills and prepare for the fundamentals of endoscopic surgery (FES) manual skills exam. Surg Endosc 2018; 32: 413-420